[Binary content not recoverable as text: POSIX ustar tar archive containing the directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/` and the file `var/home/core/zuul-output/logs/kubelet.log.gz` (gzip-compressed kubelet log, owner core:core). The gzip payload is compressed binary data and has been removed.]
2’R5^9оϒ}CVݵY?;jBH/ܨWZi9cn}F_Jm')|;r=SN(h{ulud`'!2ٸ [q x%SW}Cm.>I #o0>p%mK|>v(GwɣO2M!wvH1H7|ceLv{b T+mcG%F|^~/٠%YV9TpRH1hp24f$1‹t:;lwBQr/mq*A4W?hƜ}(#MWlTBGiԂV z1zp=k9tηkb[殦,WSfyv^ڽ@; tI| 7oSDl h!nFoMiMW5k3u>Rgbh}gP:E4J1)+dVRuc6] s'#V#hS0[ E5PD+͡C]0HYT.m+DXCixGWGHWB2QYv y1 E'7Eû&+˵p%b^$Un^ݨ8=~c *X/vj+>Pۡ}<ۡlAW]mi]Qbl],ց Q ("e-th9tBvTGWBWr{'lnXպ-th>tTyWHWJm [ȶe?D\ttut%9eK凣XTɋU|9o" TYΕE&y0pg[M-3KZ+Jj -"Y\\w$pk "Zq>!T##$Y(3EtmM j*% DO[=D1B_ JM;:B2 :0'Y~pyk"ng_+Di c+ˉ0mR5Y-hO Jn::Fku:~,O1$#7XIG8]q.FP=bs%?S@!ӆ'<2cڲ;EZ8a0)|vyTZYT\H_p|Wld72-ݬ\_829; J%Zͼ-|ʄf$jj, m3YZ1YI% EFnf'.FQⳓ Q 8P';4 p[=%t‡ 7#~ _W}9 ngn~݀ f]]E?@HSpgŕ~(FH's(yꔌixh gYuI *.D9>̱ +LϨ ~_=4;owB"=)$>FvJvchg; *@\w=j7z F!\:3_%*P^tx)s40BBTL{K\`2+Fg\} G __,}Cb L&7J/cRiB 0Ug)96ˀ#&әpJs*w`*}d8(/eQUd9tx.ngS٥I w\j6.?;R`>K&̯%P܍X]7-Ys@>YTmS)i+uwɕWiĮ>2' 㭻.~V(. yVmIPEeDe|p9!ѺL Bi4,x+@4gʆinc^18~4ӳq0\ -#r4<.PWt\}Xxތa>5u༿ 2M eJic1Ҟǁ8qRD8X) 㝴5ڦ?nsT19TT§WC<1@!=#v`!e*1\cCrx4'K0T19T:tҡ :-Ajϛ,Da;&avh_]faY%ˣ`\AGُAaXlt<rQ4 E{=~*{\)+PLQ_REO`g@GדɨDǗZ2B3_ O#'n7Mq`#bAU5 oqETE,o`in`n2a|1zdp;jF S/>]25>h<@'xǣwiźmlƢA(Ihʾ ƹEe`s`VR t@e6uvC4v8kflXh+س}f Dŀ\p|M ~( dsk,QV8dPCAc>i/ib$m2)G EέRȰ, Ƹ#aREb/"D4 d3A/q+Ί݇ CQ$K4Ɇ a,RT d}(_{[ =$,\ʭo7/v?]ݥ;y } zpG|-C6Çmt#懐#uFo\l?jߏJub)^uwghNS&hclO 6GPU.g\z˱RY{ѳ'6vXYNf}řUI2rLM$aˈ OP2$H#Uym8i䛰rE؄ A/Qc(,<a8z @y>eO(*{u~g߆W[eG3wJu (<&r11NcL[G#Σ4 a9$^ Ld>R x:R")9Hd 3âҌ; K`@x:4)V3z5g ݄_OXyͪG,4?3WFB]I6!q!w1qRʑq/ 3\sӱ`d3e V`[qD@9QDz#ԻRL]ϔe^95:[T@1AN[l9krTa %w$ٗٿADȝn`mjI).;:g^T EpI I\@+#+XڈBQaad<†6(AڱHgҪ' %Y'~J=fFsF1#\Dj+_t ˧W.4ac9I,2ZΟڌ?N7!^֟{4t+ Dڲ4a/\BΕ]rί&vpkT9 [8S<ȝi832?>wHs]ysܽlv~qFphXW _uhmn7 W:w7Ο Ar51x4hi/9/6$Qs-9˝Rp!\;>,4xͣ.;`x𨽜3HP7hY-Kix6#A_$e8 S٥|v'W2:U78Ŝ9s˭ĉuXTyUjzX3Cj 2A0ɵ rMØ8P=d:FY^1H,w0S{hohqe%ZmkO_*wHl3#bHl]lߎj`!rFACsKFDoAs8.wgE+f=V]%u{KnJq*QIeP]1FjJ3u-*Q+驫Db:Gu [Y`婚4ߖ.Dg䄏i{f޺yaI=j`jn=f0\7̶R'a_ӔnQ*K))n6[43 藫bNeɀoT+nYVw,pq1u#5ɚn_vK/fNE΄h8Yk, W֔Jb}Mtshӊ]"Xո- UtQ]INF-RW`tȕٍOԾ4t=^]|TF])αjS#LV5`SWWJ\O_]%[r5"mQWZO]]%* PW:S@ 
UW{,4qor[:K틟{٨\퉪jѶSWL=c"u5*+q[UVSWWJ-:uu.n"B`[E]%j%:uuTSW種OfTIt/ܟQ~>}0~V~Jh/ʋ*B*:H%Ȯ#x:ymdNŒ?! g}6:@h:Ԥ-J&O~NoaU.QDDRyʒܙT(W 4Or7e<N'U뵳 lU86b(I9 F{ br[4ڰLT?1 jkR꿏Gn>osʆ(6n|+Zgla<%A}qbP!ԖHW<(Mj Se|tDE+tBI *P @ B$ ,-̖3ˌ3rM|}ߤCO:5z-VʴQL6jܒ9.-{UDQQA(MD(hc9JpE i#ڇ$Nr|^Ip'ghDkd#;M4@J%,U8"¤c5*gEC% 6DHDᚨ FT8G"W B[@a a9krVw f;n,s qRaę2F "ÕҠL`3$hw"Фܾ+z;XAIEUZ -6vQttؚ~SQWትhS݂^GnFUq-vw~Y)ALIR;m,F8'.Zj'+%^s!Bt<"@tTqXPS _ `pP"TmXF` aaJ`pK5uќx,XS8P K>G'L 3 Rcu#hWMyצ\َ}T衪R>ʮfvr/H] SeXJMutH`]wU}j}5Ն[cR;ĬW` ^ wGn1Xe͔dv.{ܯ꣯,kE.X)b4+c3/Ib27aA@@^rln%9 ,jM%|^XHdRFB)7j<[֑aYqGkZ _Dtb|,yh]9zi{vҍnSe]`WX&RsƕVejed: ᙠU-_%3P 8@0q9mPbwzCp|U/ _iKR'l 4I+ mU$&5Rh*y~VnyZ+Q`т!D 0X>iѹcPqU|< X_=顉G"cL@l 6qk\"Nq4TtJ1p5+[Ӿfw(#h=uNFT,%$hI-Fu1 %ywTJd$ >b N%<@@jQVaRϻB4̼goZ*L؉hE5 $GIz8Hh?X(v?s89V_k w^NGu 0_M^(SYAX*,X=; t5#;4!LÐ;?<Cbh#WQ#y߫$yUYcZy#u{s0G7 Vwithri}^O?Ƴñbȫ-3߯J?!|+;i?+aN& nO&SS.NwPP7X@وtByt]Y뫺na}a&(㰪7F|#@g]YY}.m.WOTQ&Y<5Ӧq?4%ǫkG3 䪄[tl ,+eEayk׺~&:>YaUxdаQ&`L,4i\&6}}x_L2cgSAΗ|qg>gcv+T<ߥFOm'zP>[Ӊ#j篨 :?xG, n9r9wZh:=3\W|Vط {ʳ0ȕ 8瞣r3A _J4R(ך.4[Qq\\n!Y0g]`&Vg9o.򄆛A"oJ Yxdwג29کzÐ0<|-@" O{lyN@P:kgH]A6J3ͤK<[C"s&S"'47\^+BAjͱ{ҽM(+ֆIN=ٴ-;|AbwL:㡓)R2[L9r)2e(Mѝ9ӵ/oW1&Y$ԈGːc"I0UhD(bg$˕I@>b\Ukj J1ZQ), ;1xFRj[p`M~Ye !>ay#d!6JG I&M2(:PZIJ^)zU<߫{gzUV(ӰZ>|ۆzJc>g:> oӪ*xn: : G (C/|أɏa: ѭ|,lC+R޿t%CbD a2 >99>^^ܛ/i>7rJ\hN{Ɲi/gfU\|ah9U(8_m5 M[}= wvo& /܏&UY*[ff;_a%O1F YJ*w߮ JU[XSjv*VˋLsLq`ǰF*ex18QvAn{W ?aتJ7dJ7`t3Oin{&;֋EHRocB(2W^ 'Z>-ۅwn»]x v!7-ۅnwn»]x v.ۅwn v.ۅwn탪!@yVxKY»]x b.ۅwn=323swUgro>^ -xyKƧ[pVMPCF8%8ZSpc Vg88s[ȭ> GՓċ&حmp388vڥmsVZGaXgό19p GȵE3uS.8 F|!u(FGτV()5y8DL!MDO,hn05VV, ?Ը#DCٛ{{p ϓ=ٴ-;|AbSZ»]X.ۅ3nq[x v.ۅwn?? )ۅwn»]x v.G4%Du(>wXΤCwiwHro^.^$d3Nc -r\XEx Y)A,VB?z-rJ3X&JK뵵$a Q*7$ha<Жk!yGJnfQNyD͙IE ƆRE#7Е[mi;O{y =QR} ~}ǥi[]LI #yS|V8'Zss0:U:Xo87$*M+06r[\4|\[.q,W QoXa)y4'ceP)9@jBk)%RC2ύ 4/Ǣ:N؊(b ic"#V'M ` ɸ|b+6n j j?ua40"h?RI\IF e*DlKehe8ƭđ4 pmQJC@C<3:ZӐZi:[ ^O-"xU#@P6OH(ksARb AqCHaNreF15ʷl r-*6t_}ހU/ Y֫])Ƹr_ c\l Z 4&j)$8z]\/,X&F b'q? 
i:yRyn%Zsa.fOt9}]\D~a: TWkߝ.hEOy24P".Eځ2" 3-N>GF'_#{Фp nGN|JO|tnk ķc٣a?{v|[\mJ<1qYɐFxB4o_z4vot.v+hJ*pxb+?FNւ%Y,T* hҕ~N}[c&2o A4i km;p?&s޲_|r|rfp.]5ztͅhMJg6Z2ܥ@MtEQ^]+o (۟ ƣ4=cQj/=RHl8{mYe{:v7K5t I&|bPbjǹFQTb2NZ:7rKV_~ l^gG*1IkaB,P!9gJ`]٧6? ;nɴyirMǹ~o/f|=DuO(Lɬm3O-c#^|3f##Fg,Ȑ>i$GcvwGq k;3 XԎpdM3 節`]a;#ansƞc+MtEÏG6Õү"cKՅ-L<837|~C#oCG)ng-Ӝ1OPA.`;0GNI7ZVg(HWwIaA)]hGŭ†;f4cG8e\4[K~wҔ +G@֒*Gcg}`Z^^SN Rr=cbDmbAƩLI-j_mmZ2>`rz95pwi*{[Qz{pvhXb#VG#$KS%JQ*d ylPkL}vwe}wU7NWң7ϴgLRpxRV*eJYRVE u<.<.<.<.<.<.3-LòqN ~ֆ #]HZ6P OՑQ6>o6?^|عi|}צ!t AHRVa S띱Vc&yeĀȱhj4BZ"Zs-a]Jd9'Of웮 -,Q;7 qf%*Ű[)fcq;i6r71f½gPaegGJJ%*V弽P2zE+brK%bdF/cL)lEE9P3JV@f}ÖՈ*WE E#&h^xҎe# '>~V"~_u*HTԏwnM=zYX\zKZ9N;;N~R Ej`X}ު|{/tyn?Ԭlx]bf F^s7?j[\ݝ%'_O)~/H*6Kr6@ Daf)aU)ttݿ Ąo|P+ !?P4M†pߺ#-\SI%Y.T(\{S}"oFu/?*PY9E|xB-f3av?M˽ }tܗ}Q{\nEܘ}OS`ۜ,ڥ{kE7Mv};оhoڷہ e1ko2v(:~b_BXR턈~4>޹Jydޚ&Un?F0MshTQ>A#QGSܳH`P/&y36j\IkRD|;"JZj(:PaGd^{KaOĬ]6]?d@|%S}^_T-Sp#N0Ky1fi,kqR(Rm$)m~*G#ّ*Xct}{W]7ޤ[^%wXE.skxBtv٫nV58׵"@ę )DtMʽ)mPRJb3^B':#NEtgnxG?SR;mUwS1lLF0u1R-hv)1;C+flJeH3f_+i~*y9QMi[W!ӆgHua'K,PȤQmcQ˦uW!ݤ'vC$ioi[LׂvVcO`VOB`iߗoKt-PT̩EGI@X47 \]7^z3ݾX3^zE/373R_?-㩚rժVD _]^ 180 D:AiʼWVse(㣳$g&*^HZDjpp^.V@i6L҄H@ci9I1Ԗ~o Њ+6^޻Q}zXJ`:vnnk k\:YЕW O\]CgEwB_VrvI M B9n X00\4vHi$^r[|$Kd4"RȈ5L2‘dI1Mct+ղ5e )riHn2S oHzJ>cCrx'K0VPt!ϕݲt z?Mՙe:a[9ܒ[Hc7ڰ⦧b5Kk`@"5\$$m.sAH48EtPV Qo=qX[R`4&m)q%1rU*ȃ>ӬZ_iVID}ŗXo}>f`y*M'x 29(HESN_E$Z9&F\^2BS%L*RĀ<JH"i9iEXG9hZ) <<ǨU1kQRk zqʶmж]# ,__fYF4|@,`aWk'\RCR|veRx(F#lh QqiWJ4_;ױ4m<4ΌbFJK/:1رـ6O=xLLU=dNi1ۡ>A:lG4nzĕse oAU=@) 7a< h#->OqP:9%5ݟ@aTZffOi}{ȹ!serЉo!3w9'YҬ]z˞i'dn#̐5K@jSdM/ zTa Z06D阰"g*\rSb!Vz+3~_rw?W9>' E๾"ߔsL(QmGs+)/"1rbrX+/B"+"ѱ9K=5yjť"E🿑v]9brE~4"lÍUc>aYb`X&g{?Φf%BKk}(߸Xyx^Oo }?H[ !{wq@ ٿK ʊb M/Cs "9KW{!%.`\2ש u*qZc> y4jbTH&p븧qm# 06 !S"W ݷgH_h+LUuUc6HTfc>8?;E(}C~C pr;Jig$3E?{ӻFu_e $J#(ZE5`̐~,8y&3>|g?u)^~3Mrdfp%=}jjvN^h.Pc*׉f9He*2@[#cܳN^xd$ 4.@:wqto[:xjgbj{A+ŢVɏo_dTYTw7os5}[T՝j5$V>P|ur[%lWU) l)ܿq教DөB2D*uLŹ",/Jz!HAq𜲰PD (w Y8^K-5ZAIvGk1 vqQwˢ ZT<ז׆Ӻڻrwrzb94Gf)휠q,.cE~5\c%xͫWgEyqz/;?'s 
xxcJd~cD.B+5}Èq4)6R˳}2\F.^Ï}fAEE~DWu1t1/7dJ3ɦ)Iu 5;9,g QU^fTgjUgR%ǠdT3nN\\TOIe}4<0Km*CL3Ń>'L2L=mDeBEgnkfwԹy`'_t\Os&\GM<^ꌶSŠYQ`//n,㺼c9\ 9K˭.#:F GUǵu )Z4A0ɵ r B($2f4j|sWL#,8ZI}A)3)I s>|U]3\x\jM6v-[Ę8zc.JCv=E^m6PN{\F b K=J/jx7{YzZO\&_#4'Y6|?YD:sw=dt}:{j\!I ר]QQ7_> IKB9UaaR"嚠jɷt >=1gJF(bR>RLSNq飶ěU`̙Zڑ&slrTjjP^oOZPS42aD%$@tY$h) Ӂi]IՄ X2[9jψ"88fSAbP |@N@kC?# Ҙ&8쁋屖># #$$: H[XA`Ir,aKMei-vHN;i;S4X$\'逷=x_}.͇@1 {71cb;rn;'Lo}4 4)sمPuWuT EfǥkE2Ha"Ӏ6du,7_;ǥ݆Qz}(/aN(EKf-D~Zep2OtZ51SHTJqTQܦ2:YqGkZ HbV{Z&հ:ҧkmֵǷo:~r!#{OTt5|h1jL$:`e5{iaoZ3-dRbJy[_tݔ zG)t]39t%T8 y Қ)RR%4]ELDK9 Q Ƶn_t7%OCu-hvvNffjmow[2VVaǿ^_|2dIDz̐2V(~TJ|'eeY) # F&֚_8?/Ϊ <Όp35`fx6U3WÎvj7Jz !yd|Ep \̔#_gʺ<{Qj h hF 'Ya.;nho[5xjgbQ=ADQ yu7գYoUۜ'?toi-|𸒍@EqnG&2˧س / np; 54 6GE/r[kYs(ߚ8 oӱd*RJeH^NѪ?m#8٦(R9p!$HcG*-1ˬ)l%UZ8xNYh4x`O2}.ywϜz|Ş$:H@HR mS%L*#Uym)4\2FXG9@d) <|Q4i+p-AZ/5^l:xόΟQ:,+3zf[oA~RR:cw`pXLicRJRb0[x ||Az1$PuJ DJN'2(03,* %pD0A nF>}*?f½ P;yd4>yr.X myNm#V :yL# VrBZ,H$?V5K5V. kVR[!IRzH$h?qVFjaFzL~mp&Ɉ +xDa;%wYjXdX_K81Öu0[o<@;z.8 \*?'_xQC/U?ܭw;Nա-~WV}?|,&{j,SR2$$4Eawhrr3lSSX̞W Silj[psiwE)siK=bho=_߁< u&c>!1HXqשQ@qNSNF[\DB2E0qs㈀pbxnl|<rszMbrքt%I Wp>veo)Cwwů՟͚Ll9+d[E<}t:If'ڡUMMut]N;ݹrU@ieۙL>ҿ1C8 ݂N/ߧ+RIL>zXA01 ݲPW"}Zp#; >a.!:HeRx(F#lh Q=p;RZ@_Uՙi˻Et#\Fj/:KgXl@yK5S6O}VϤc'_fihQr{d_ v= й apRѿQ7};s~3t~LӼe%uP*lW?mvj#hrh!b ]%`T=ۯQgf7aׯҷv3u bz-ÏRo'R]I{1I>"_p+|۱)`rtd G`~Y#rD !޵q,Bcb ؇N س%"諤&,?3C8$%8d_WWWe֟Χ O섺\{SاC,ZEˎcѮ ;\XI=(N;!щ^uBdb~r[4O>O,Yޝ<̿.\n C揢bKm e~vcS*ۦM1lk.Ev=` \a2mN>`(+/Z>|_y͘ ~^.Ncacc&UpQAsg¶**GT KIwP!;m}6.bw|䒮p=eϴ{0C Qb밨{Mr~V0#(b`R+m|R{,#ɲ]zᝂp#߫K`xؕ8ս?ζ@#*TB TPB T +Sr^N)z9E/SrޱgQ[xY͵8^dEၱXiT TV0 )&e˔)|Rd~˛8p^99 M%G7!{ !,Gf1JM'.|9IcpsDn1 aNj%|bWmfre7oWi`,w4]Աj Sܹ3kn7AXQ`nا(7ؗx:KWnQu7T[@jq]]/:13 0 sHW[]E‰uXT*` $y!E\(wp)B!cF@1gz4‚S->w0Ѝ}ov hNYߛ?@U2Вm ۍ4C41|}2f\ׇj84.Kf@ 17QV Rb:!Urۉk=p=wa*Y:G{d{T_L_$/ e zϏC[[B_n&Fk4~4s|]`B<UX-zg՘IDk1M}5ͭl9{j0?5ھ}YfY%~XF¿9N]`DZʩR ˽)C;ؠ0.3qڽ9J]\6t{3tݓ4 |IMB}XPG Vp).}Ԗx930PHZ(>Eήv/KwN;@m7ƷE F+LxdN0+D{-"#a:0- 9r؀ a˾# \HLg@YGe9AQN%{i? 
+:tL+ jbKFbR$S, A9p0I̥:,V4hZ6h A)L z:DrqΩzJl{p&Ko|"Px*39 /1޺[M f2kVۖmV*\u2(,=edcY'kc9T>e9s=H 9ͿV6bwmy=qx5D^l{JWĨ5FOyKQj;j&~kmX2jh+ɼQs6`F0v_4i=Ngͦ`6gVp៯f=Aqs_o,CB`mYC!TZ9 QI* AxZS`N3"$ك?/ٜv[ϙ=0u뉏:>i{6/aVWݙY8L~QO b5v_LatitՌ?=s2?0jf=!`2${l'o7,G̼6rh&(7s8^gӴ&;+0i|2~}AoBxiiق#7G]"ĖD:"ҹipK|Q_δD6R_BkHhV}=W=)=$͋roJ,Ă^Dc˸zaL՝`ӥq_5ZI1nxN+#x<, NU=zEQ=Ʌ=)yŊzAʵt%Su2QJ껵@ի+a2A!%iJS潲+ÔF%A<3QCo,N()ya`"ZA+AH"I`ci9I1?sD&;*/8Xxi5vofK FXyG%[WO9*k?Lg\)U JBAAmXTP¦MTR#*#+TT!HK ^fֳl9RjY-}'ХcTa(f9NF /cPybRya?2ŭ/*Sދ;}fVEdxARDDgցb":xЅm.as[A-_O:/~;}C1 LyE{h֔x0R -HGK`pd1󰸿{kw:ߓ`:o׻qh6M0>MSLIR;m,F8'.Zj'+%^s!Bth[~s`R/VTfS61@!=#q`xC0% ت9k "+O=V`hy&!Y;'̸ !7m8iOtrr5H̗ \"Pr5rb6ήn2c0 aVdH洨RbK*o+L6l;aE/n-'u05~7B09&r6ȹudXDcZ)"V"D4BS r7aԏMǾi(tKM0.of>FF_'טvt?&_&ox`~[^={6f:эyh>D0bi1N\epԷ7JMS:-?z^[Ki\GSG3WuBmPݘQVM=;B3c rAs4ŗTpvhf#W(88Ҧ߆oIӠt u׸q`{e3k\?0M:_?[bގ鯵CC8"6K#BXYL^jʈhA #(ʘh[:[b0+>KI*w~~`@b<ڸ5mCCM0*f͖)zsz%T8 yJhpP%J7bi1g;Ja8Co/ nuMtg[Um;ؒ{EZd_j>%^^^h,Y|.ڡWWgV'VPщg:DX$WzhQ/ra=]MYOmsظW] KKobGHJ&-'Fs~tX?_߳)޴L?t)wۇVve~vEXfwh;xDOdu裔ڈS$86{tLcC5gZRn~-ġ?8ދMȖq/^xCr1י޽]Rjؽ+S֭%WbR$faFR!FQŜ QUyƥ+Ujg[_UkF8)v[nRނ!`CTD #HR#lq!Q&)bD3/`W2$k (W$6!`BJ;f2Rʃp-[/5^]r$rL][Î-3{p/%NZ1;OqLEQHRQs@d>~nn5¥kXyGDJN'9̰4NE4"P (O 5"R} -}ٔtb`F80M9U:PfXQhK"L+w!hIy$EW\ө\qE  +VQ j%)wDV7NIPE2+k gdž!, +;%wYjXl/m#ۿBBgދ -1OG"9- I=%ʖlmyy9!t#oYҍu0[m29aq\"qyp>t"6uE: o"b^t, NHB_՚ѠpZ5ʓKz 8-&3}]{,{;|SB*1U$z5ϹK}6 B˛Ay<$i a匿,̈Fb2|u$,tN(|MW!a-ǂָ4(1ƸNs܎> õ`Zd7E:cXmxD9AWRP;7E>y1RsN΂#"ᯠo5iA0&BaJ8]m::pQ#j[45Ξi4OR?x8L0u'e.E)U!O~ܝNmA!u >T>Z pX/LkDR%a;Ч w Ymo"}yqV;HC.ÕLm]>Pq; w-FЎuQeP,=%O.O J6DMTx:u֍ByH0!k -+RV6?Ml˛ ܛ#DJ|"t]}uEyrUw;zW5ݔ;axc}][Vi')Gf!2|RyؘnPө6aVR*[|n,)_{V'~:y 9lTŠK8ԍA?S/ kFKc?fSKv-)FHүvPлS8/zL>))2u LB/SN(y(\<]OTF$FG@Q2#E/i5?>GwVL}u.`y5föVbF3C~Z3!OH':aLS!>a؅q5 cƱƓA̹0B*S"Kyg▱Wg,LaBS]iʣ0T!&+77>RJ cw^3tny$XLjA%jroyK&ucE0΢`00csj &uD9hkc"wL0&wGTAh`X4 \r2Mb֢y@! 
$ɰ+#~**Aر区߲î$R bW `~:UWSaW C-zJa8!vkz:UvUGJPjٲWȮ4bW0çîdYo{\;"PZEMb6qlrem)p*,M{w5#wDtXu@FH^`w +Ɛtȱ /Ғ7~T i4fEJa"-SZD˂K%vKxMw4Բ]JוKՅɖ΄&81Oy `Sc>x$$] GH%,cDi}F`P >KGUNHǮ/>[ j6cV΢; vz 5k gAvm+6 ,YJ튫13%`luuo(@4*k:E@YjhϝQFGR*:dLK@`W;Yj r)ȹ¦(9rÏ pc|{Wpo{O^}O?B]ݕzE W2an}3L:}6ʙ->nΑn$w H ׋rMYo[G˲kaݖOP(\2(:gZ3IDk1J55[!-v宲<9&0 %?Nקw7I`HjXF|0aF2LJ1,,`0#bt}٪y9rւvv„@fGrކР#Ȋ[E1oNΛk(GkQS\-Ы ,3 "Zڒ&nYʊ|"i7^o-F-)@G J_B:`^em) Ӂi]IX0[>JDa iU3PSAbA1 ('@c}9l4 {pXKc/R$$$5pFBb%#1)RFA9,aK%-R!))D1 ]>BdQ8FOja;ņI'ck]023]f?)mۨ>k?-Zy,9`&ȋr,s2JrίFrRX̵Q*g er+gI1iI=]Ũ1ʜ~}VVjBm]j(a.nb'sW1i C,Qi .ImnӘԈ7BԬbn%k:LRG-IC,ϱNr7SD(+t0v]9yu~)fͥ"nT+ZoڧP Kտmߋ&N^IlfI- ZGK!2f4j|sWL#,8ZI}-6->ĸP>tZ&}p}xtt?,eKmU&TØ0ݼ&M_}mz ׁUǫ*B$tu*,9z hӠ-!}4WWY.G+iԓh)۬H?aӌi7(rܽbN:{>]S(ܰ{3[\wJZsBRyX:TR#D/5 ftQ ,Rg2DO"k/%%Hybi1g;Ja8$"wޮJo=}"Nop[qJH Flv\ Q &fQ袥Rn)s eeR z̄`frG7y`m7P= ɣU)˓`ߎxSJQIco:ѸӬ4#Mh'UXEpIg._Fc sr]8YA|Z g ~Iv/&R!: -˴*/EwK>fgn0i¸}Nʟgj{)~p_`xTQ0.cO}\q~P) MKYzJxdT 9p&1g*M ʠU dTpoYz|걕P6y-oQpot"fFuuեT̪;e5P7v7emv;4%2>S<l 7aVK+r|Z+l7D1nx$Ns#8(hA5^A;XP2hYQڡd;#,Ӄ)NFZik2/S" D:Ai|>ϕaJzΒ c`hM摴 %%c88/ LD+T4h4!1ZYft٢ У)dǞ K7e{ȑW0͢aw;` (:Ȳ#$WYv˖t$68;@/cC2e K=/=JKCR3nJ2WlIXXn/[y"͍87hbE@L>3Hi`4".,KĉWT}n9Jq\- kBV"xe,2"ɜO@GM$he+yB3D$ԶL˝1ibYb>PDd|p䳖6{jqz^ڇ , -L+l΁.* JHfTLg:}Bz}d5 mYf8{&B1{7Ik$-RZA8',~{y'aD~eѨ BH<AI8r3؄ aue7ȶm.ܳĝvl]ZVG͙I<=iCm= Fi)X$@n]t>pQyis!Y˒Ar(Ɠ`r[ʹ%7y,A3ۃ= )R)B,LD9P*.EHTҘkD!becƚ _;zr;hps.jҴM(kg:wJ g[K7@jXef/ ty3-nqܨH#3^Eg Yha~YP53TZMLN!ԤJ}cbYPr*t,,B#Lw*=_JӮqݶџ\>@5*>TL0MbD+p'ZHGWZgIީoUVijX5?⹳fo^m>~f$m 0B/ o KBSE:'cϚ3!Qdr8]4LN<{ g9= ǘ﷢uc)b[Z͹{5znD|kLXUS1%kC6.7yVN~G IQPAtָ4m׫ e?FJd Q(gVwUK<@=Vv?9ÖDeǒcy6-RDDb`Dcg.c” " ЙcRa x쎿Cy> ̕0-Y'MIBhWtmilD>*\Y ᬻ)OZ^Y$kTt)hP6I[` *u=g3dE;.YZB,ʙU!,1I0`9aX JH*jT S&D "[="%z?-iJ4 v$oHm}"k8{^eDXHN]_YnI>\ICA$'84&gK<ĬIo J.8A.1~$/:]hvRnW6y]&EPXeªHpU.+:%@O=n5(P~l Ge^ŘGC2*o"c>1i .dul\`u9Xb(V\i4nHrE 9uXKn ýKCPgFz}>'0&зeL@$}N#ABN[e/I{јG{W\mP;yf8īT* 6)֦Ѥ\^-2_pB~Zֹ.vfzݕ19?ofE^5Wnb2-۷04_dӫ94%Д$7]?g<F:sbjס#n[\$ݠ)#7Er:;ҠX.p:Lc{Oyf >~(4|𷑿"-MY2n hbSgH"L$8agHe< ^ᗳɔd]37 bE r9nz1z} Sh~n ~kf}Klm 
E!mv~o>Ϸt8)G5…~ݩ%!9rSs1P[{D-Cŕ!(>cxK2yOa)Dߺ&s"eDeSX[SNuw|$]E5 H `80q/Qd%%vnJ><7ПK̼PmpNzf&T%wFU{j?ԣM |6I}Wr;gٯٯٯ>jٯ_=ճ_=ճ_=ճ_=ճ_=ճ_=W~W~W~g_p:5SsP:AsPj#,; ujZ99kZ9{HsP:k%cJ.mca֢DZH q* h3oQyM7Q7?ӕ?_os%*zi{|M)'c|p9cϸO\5z(f)48V"p*_o;3E.*](xH1)Z&ݻ~[cFm'uX}CQ}b0ی|sw}Vngw~^\/M{ގGno< OH0CtX"$W۳Qx1Ԣж9ſ'|z+eeYjYZeeYjYZe9^/Eɉ>2`u5R(L|MPV`:u?U];]ɮq8Xز@h=NH;=մ8rvzxu=j~C}=!ȘCtL7%GZ'h@f2Qdϰb209x ِ2FF FGE\EԴeYÜՂЍ~3 %L( Q tV%gJ̅3+$rHDgz66=鎃/d$7M7OAܟv;نw} PIYеY xN+co6\c U0`-|$L8unϏd\qIWΠ8/ˁ Q@FϭBG#=1U g7YӼ)Wroro)Zo^QYFChsh}1t^Rq!#j̙U9}SfR՘1{!du7 9mN~t9k%8٭}U~@;o'o=p`稯%R7!k.s y ' "sM5Yh"T Ƀ~t hѣQV;(uL^[cBJyQTB霹VY&Pqs`eN-S+z،6ˤ;9w>+ ,;%bll _z$f#!~I~?lr&$~ hI>F#/XnJQa,kp],3Ro:K<|QD-풁Oچtye~jhlnN㉯pQS\}+\dR6&w:y0 iuIYʟZI'38wSrk"4hM9.>`Y@@Yi93 &L;b։R{YvuIּ+GT %&e{%m8Y/ƭq~\bn[Q,=Bh:ҦPPE˟oZ؝hr&eK.gvTɜ7y,G+F̀)X1%+`ڦH䬼ds|#!_(ܵ'w-FT6- кkw ɗI8/~נdZ.Ȭh$ AHQil4*pf \ʡr&G,ZcXQ>P+Ⱜ16 SG)!FXjLcl|Hz$L>8AXWc8c*X l,* #"(4h1yU,1 }sii'a`E}"4ūvکUO%΋n{X !Y)VzXgC7Lpp*F1&x#B:rΌK=ԠeHLV@ȲxxrAjԂcT'T^$8APK{_x8 kR;}8;I4֡Ml77~.0'6 n_Z7]r&;:]َajRˆZwmyg#yvz("67=ۜ<\wsgyV .}i%f~iΧ?o;ζs#<^{S|FjnUu^: Z&lpej;OUAiM5 lфEixY [StNYTJy.F|X67)8`Ny\.E{A0X'NTRi lSAͨdZ.F*1&K] H# Ts +w&ކG,GnKH\fXt 26$SK=/=Z*Y܎&<8WlA~r}GE(.:'bĊ'<|f `4".,Kĉ6*+e1aMJ%B<:Y#:s mIȧ] @pw`p2AЯE҉bUS"˒-KTD `;ZdYOUu)љqMaX`q 8m?c1EMBW]ouEjcq?.ELwh6E8ǻA !Z_F+?vhů&Cpm,>%ܕD_HSlI"i__|պ~ֹCNyt</ukk=z3ɎE[4WTNrIĶx1q]6 &eF2 ,Z#ՁHOi=#Vk묙3-̀"y1'σٰ\%ގ+v7=u=.fŖqIPKHc@3CMP3PL %*G9K dd .UPp[/fhYK/7jfnC`[· WEUぷF.MR@c VK(D*Mռ4Dn%LV1V[}4sbqvKL W]"iM7~vx}T :oNKO.^[)ǩ}Lx0qF;Y`Ϧp'ٹӮd)7X#\٘,::,]7W(%ȽzJpNe0$gcjq͹,+>zJrNAXaq@qVbPZcT͕A 4W4 ;sŕgjus\Y+#(0rF* ,٘,lt>R2ڛWc_F㽫||~??ǓQCȱj;NDWx1שd|ɟ+8ZJAqB3<8GeEjn7h(db*PB PhNSbht0paYr'6YQe J4hR)W0QNeqQ!A`<]Dʔ#sJlDxRzSHZO4X@@+Qx0HƌK X01(fPgBFISvra!΀;8 %K$+-2S en6"wy'CPx(cבGsDž+MGm8d.1s/-ZY;$)! 
^Z)b&,QBsʶ,a.v_}xBַ,)1)Nma+XZ1E Q*IpA VېcQ `}CF}#wh4ЊCVc,xnoYiA[>j-aԟoݮր1"j?H)QBT;hB20@#Bi t7NUsV-VGiΈ9ToƼыO KZK 4!eL^hkoGe2Ӈ<%ZS}[n<$.88FoqU RB3$.U`|_(A E h`}Wњ2:[cvxِ>>)!K%ȍVǮ>!#qVF:<$7>\/y /7rn%Ӛ`+u$ ;鉃< V^</n't*|6dK?ɕCE.pʹIB8I@e$gSG'"GG9cp*plY\u++:oOnC yIZ#u9|c-1N^29O^&ӖLZCNS")ew̍̍en{z 'G[K]d^|PX/>QQWvcOXه}⊵^Ŧy"71.r_q㮘ROrW%~N3k0Eز7=nLZU8r\w ]WV=B&=(#(߫qߌxfmy6l\t3hsJ5㬷r]rk[ཥۄ&Q4H&fKhn))oqF#ԁ}>~BƝ6|8lT=ṡcw Lw 6fE$n0Z_lfLx`8&9WCd8Oh᎛ևx*8,DG諙M˧|P򪢨pS{R\ "Q>!s[#Omg?S+iSϮS]t~Qrd}GYNj |Hn&x(g0%wL,8 QJ5"Vw-(-<)H\kJ=!&:&ΘxF.EͬӮd?E=o/SYzN7WȊceE.j%Rߨg(WhJgcC #+mIE.P,y(`/4ݥ\*`FR,ɢNm>&`2/q ,ڣ&ƄBS@pBRiM"r K{LO8 oY˨l?#mk&fsO[4K'Kt TƪN7jO5mC^.]ܐ%OrGʌa0,+GkP_hD%rYd:BC`.u^z3Cή!e!n|(1JH&C1<8pAJ** Q3h Cl4d]U_% }61:l 1 k4wI!y¦XH(=í{ٮX5,bv@h=i}ڻp\6~߿WdXAP٘AеL~-o6;m2<Ֆ B% R2:(^S @'",%Q `_a?P_ɺC]vPhyuO("RĀ)pHL09uTsj=HlڀmsJJM$"Ckhe:%* *Qsx Zl->7Pm54={}7j狥mQ-"wH)BU Tg8 kQ pz.*_Vbo+t >[KO2=GmxJOQP$QTR򴦈Qb)wA#Q4#Ḵpj .>Td+šM =j NM"ϗ'ی>sJ==UOM'Zj f/jv7Ux~4Ouޠ_~MO;L+2+ʼn/?{ќ\XW^"6}O_N+l2-3.hH4Jk`*IVXFRk2tp+Ňmun P_{q{#qOd5rS=\a=K=F–-_5P1HL459(w[ %Q6&9k5I{נ.0nAoӓ巇]Əc__o hFh) pgY p'!(fJmc"^& V;-xK8 KB*Ѕ@I ϣBa x">wюw#.vmBžmu21)!Rј[$AKY\p^CQRţ+C* >5!c5蝤uI)%v_?:BTQ[Vlwhyf= N[Q?o@XQm\ HHHхuteδuN0B1h0(Bgѡp 6 δ釳3xernx %^;sfX-SWVxODn$miɫf^rfg7'<=.<%&2qUxfeR:'1mcφn/r)1~4%dodWxULGuzmBW`[2ڢZJɹL%zJE ft-*S~ t+KVͩ+$jur 箮2u|; v|pJ3g0z"=_û??*Kg>  .2I kX_(WWAr ]M4cRLM%oSZ,Dd',>jEK.xNԋ(j2*R+qKH iq\K1B*%>ǕbI \JO J͕>j &O:"}Y5iz`Ob<+ZbKCBcq~"G-q^]$i|N?.Kz/IcuPAC1Vh HS@R" y(76>]~E9/W+~6 FS<%㩚TgSԌ8A 3B[  ;6ihIR#}6&'Q#h\DJ1)PV*&TV Thѣ=Z M Ѿ&&/8z )X/̼p).M]scn!e:ڭZWΉ/QOKt'PYL:A(2B p{ib҄Inc[TߞwrvI!LI9 )g O۠Qp]ACrfI\1ŒRbr,0tT.gIh :F" Yc(g\xbV2z闐]D >&dP)9< 5:'|"8 M 4)o+E~eGN'bEQq„"D1S&d\R>YiAZj5NO:?uyӀ47tv. 
B20@#Bi tS(qߝK_x⠾~&; oGiΈh9ToƼӋO KZK 4!eL^t܎v?C.8>kE>ݰJv@tJ<&+?Q),'@9sAq59Dz񘃱j9*a0q~ 8(kˏ?֖-D.>: HxWGA&I 5##;.o&t}8*?7 D%,x# 5/W8XAt&f{/Dzn'7Ai7~U>g 42=G'Y )ホHkZ~*znl $ #IψO9n5VfpjҘ款οąI?2^x޷8Mp\5b B>zWg^j3^cy06A5_5~DE*{7aA8޳^Io8C^pZ$exYm `WnW#@ۡ?cֵ:5V;Bŭ[Tz~׮μꕫ3GҜ[>(`]hM$ |X9ﴉ$;&ǸYѰ_^5r?b`E]4ܠ󏙀>cv| AJS ĩv!ă2L,h5T"=֦Ո:{_n>cdY :ɳq|}ǗӖyerv L s*ov||2Mg^1z%դx 9yyu6- ;Y2Қ]=Z<~|m"2>& N}+m[R&U)^h>m%؀ $x|tp=Z˹E ᬻ7*^@ l+(v~[~>8B]1~pF}2'Xh{lshJ(WxBYbV[`a(mdMh'ZnC 1#r!Xā @uB!q<'G^mgQcE_}݆euMAytrszoUL{Hcq>C̊&^JXMtɲe9$%^8ڱ+ u^hv.K"ya(Q2G~a f9x֑H#pC5y{$n7({< E{Cl*Gn?Abm>audӲ9֡BTi RDt 3On3MeF, Fʸ@cL[ͽ//}NZ1c9qB@k)L>FE,"*n-'1F1 ds-",t<3:^W+x/^V!|I6÷ r5_ K,%ݠuNǎo5ѥ{ORGjVoswzOef1QhXըϕAr>8Ib]z~[Ŀ?:ϣ,,r,0m׽Y2"=Rzp-Vw RN+vva\3?eܜMdS.pٵO  j.pOGtP&2cqZڏf% pkߛDs} ۭzM=Rt,dF:9$ Ng]č gmHpt\e8yKE@>XQ،#ZzoY[#gY@ǣ4oڕ\\;Ym lo)jp_9~rRX@ɉ6Nɠ*)etk|Us{醻W;ٮ :O,ll; iaw͏ꏯ;vf[ YV൬&s]b"]H#>2†,T'B%6) ЍJ^i#!i5>f=-YXfԙ.CDEf)9,LJrNzBa[FΞF7p:S6$7=nvk=L櫶|{2HJQr^I\ :ep!oy-JT$Hy(RQ[S!MMYrAV]c974sgRRmk٭хqculYNT92.c֓\3m~r3P`}ПL_nCJ+35I0h4M4$#\2<1f]3ƶᓭ[NkDmHXV@2E)!ZtJHې{N:{E2,gL hp \)-F͍D#qA7bI*lC6ltIZ'iEOb;Ďnˊf'1'\Gg'\WT 0nOH];uE(.NE]jA*T.^UWRiS- &pigOFLhr7W(cuj\I~Zҫ[[.o78>'x747QVXv[EmFsW>vǛNZko~JxSb3x{]>L׌+Wj^LaȰȖG4G)[*O]i 8541^^xj^ܱd q.lplR(k+WQz6q[?L\Zd#cg`Xp^I% (@@T0I9XZGߦ`>K% )>J|bR32>$btr!}rnds%$ Rq~@:3Kњ ]2ͷV^7<L&Dl\]htԶq`-4d2r]^^ Rm⥜)MVsf HJBR 8<2#$֒> 3-cBzՄ1E)ԩ!2.}\mt $h8uOiÐy2&6{$I$q^2-iFϐw`!"-4I>lM.FӘ_0;4q!)bv:rL9Jҏj#2Ɩ.QIiR.Ғ.xmFI?O`0F3Z$0ԙ>ʄ͈*I*XB!`RW:YR24IS$8&OMt:MĒ ֐UmFZb2d6@#KJY&jU*I&Ur Qq`2Βf#֑,5N7j9.MbۨźX.-gj{۸_y1p?I{q[4I j IW$E{YRe%,1p3Ϝ=l2zbJ˲fP57]\?70c)(|k X "Y0(3m[(2#χPʨ[sV X,pM`&Ď\)bDJp`gSO@nc-r`f̤cD8*VZ!hM e*3[!(s$e丝Ƣ+x3N ljj\ qe+Ձ32Fr^5 5$Dje'EpPZMil-@E@I v/UTy V"Кe8MMdPgm 9_֭ UDZ1&"9ip> ìr Lهɝ/:d]Z|ou/c5 ܦB$c[oNox.ፍ`ӦLdUt4+JrhRUA 0񘊒'8$;wk=/3(39՜@HDNWDMʴLwk<Ѵ8@= KH@e@YLZ[]hx (nGfp6 >cQI,TGW>/A_EYьmC5YK1Z1pߩƪb|=u#di"YrBUF+xm i,{إ)G9(P"Q _T Ԟ7$`YwQ׀R"Te`{{(%'6ܖSBEk@.I⎕5L >^AB XC-D;)g ޲N+}Ƣp;JZkXdי$@fj2RA ٕ@P~ЃRA*8vGyPgUE 
ת{E¤4,,Hc;&gو.BPDUb|؀mVH'YLg#NGhTf%ݢZMUр{942IJX6lӨ;ubAK0AlmRj6X;xnjth6c|.1Qe& [)n[@3 .-zLah Nd4 Ӛg)DzI(ΓDɃakBm2&>Ka#[NJ3bO*Vh <%2`rh[SP/O(7""Uϩ1ד%\I"uA:(Xf7Ш3) ʀ)w!kݰf=*-+PA>Ǯ;lE^QH"NdviP'7 B Eʟt7/dp0) ,*1"&7cQGQ1!0G&]Z`rO+QU1tFΩm X{uAfav35ziQcM b &_T4/Q ,X-l]Ρ~\뜼=NKiנB]Qc,djoZ 05^%kJmqU8P(-l9EO(Is!1>?n D;k|Tp('=fMW:PX6\NU \ 6&TS.XKC$`PrAvf5$$T$nBN Z&!Kk|T4\z!(L rt ^o>ڋ冼z`KӃ x?EzqW[FO?m|_A\Ur j=9VB %75SUHTY!B Ji? F|{ܴXa uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP u:_5 u` u`,:`I:,a uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXBP:,a uXB'8& x:b,սoB]-iB^krʿ׿c=tvxꋓ`BOidDQ9jA(QB'bVwQQ$e t]R$ASo,mV>U|R\ˋP>_R]F:pSo,Ey.^c^6߼q_^Lw <%G+:q]nQZYo5.@>?0ؐG],o2>٢i10#[HCZH:D 4M~@Q}yw`m'sˇW4CA8[Oμ6?w;`"v[|:]=0|}_.sW?wީ?ɕGo\:5]tZml߼֛ tCEY~EQzlmHeᬎL#e-6;Y<'J՗''׿M, |@o{lmذ>A>w[y-V V(ng'/Ot\_M'oNiy2II3:X(˓61clv9gߎm쮼;~ût}|o/G]赹};_,8^MVs /|8_~lxŕ6n{p@v_-Z#]gsf.gۭCf[ ˳݆lmK\x͟8 w}8i&g؀M{(ߠۼ~Խ1n߁}4Gq~w.^3;]~Ǐ7"_0)pΜmFPZ8UM`R8BBG-]χ>Ř`ybsn|\Х{Ny56Pn.xV?Vms#CKQ+fOwAɚm޾5);|r9n`C?e$*q >ؖϿ̃gY.~:'%ڛ1db%4lUG2FvcNS`Bd̜ۑq?Ky,3zb,Tѵ-?&>zzo~}W' Ogb~fI[ T9+6^iWrOؕDשubabĞ̜;gu|.=wĨmNU:JV mPr<`6Qj*Nk2Oi5G$116[c;pqq=fu=BP\"=Bԭ5Q<^^rЃ۝{V-EYDIyDod&4$wk塛#Zrrz÷EJbi7 lnƹό=&r'RAXB<*!l09$;j TI:ms}5\<3~w~}uਞl߽q:2: >tȽ-嗶$nyC/a+(n&pxt:{A/NęμX8|}{c䫓o|!】7.>ݯC|s=-gêPl\5iNns͋˿8u/$W|ɺ7olj|vV|3Q֑UcƥYh#?pdn0,nFVr*]SV5`lݫӬ(a^ކoL9ꙭ٬Ӗ˱SOXעK%$[oA Q1+_5hC2R%{3դrg_\Ju:zvibnDWD)}&伖Ӷ&3% 0٬Mw=!vX$COˀrX1W15ȚeWzC~bOMVU|} Jo8[bL. 
.D)[2Qg%wEvbmxY%xkI"wXwSE0+P^r2ZQLecj@@m e|1LgIfDynMCPB> JQ#Bƅl2s1~7X:$ P?-a.(SU!B8SNT+!k>-.my-VriB-X1w_zrI$WM ٢2՚jsruݧ$pDxqIjFaz!*T1VU6ٻƍ$UC"67p觬X4䉳~$%˲deIc{D_i!}@:Z mlG#PNt6:IϰlDy[[,Ӕ `Q$`6# q%-^JZy a!:J[sm:6_ߧ5">հZ62)riH݆e0zHzJj ׹XcSՊ617 .1yt膥C H 3I$/̳M/=m]64$R,!M][ ™ d%zIJ,r ,/aN(Esfͭ$G~[e= v4uۖxKr4ѓcTJqTQܦJ:2,"1rbrX+B"I[cJˌv8nq{rt '\8QG_XJ6ZnM' fqnx;X8s K)mš1js:Oۧ9S(bkͽsoc&qJ>,}4xU)[ck\>+: mV_3qVh |ˆ^FcD2Y+냉KM-a$EV)[YFOg׀ *sPw:%.M0u~tEfArA ,Rg2D'HkKI QDpC@J,s$#MMZ[Yz\4˷ajĮa7=nM$Zyp%Otjp?|\ry8rY߭dV0[ܟ`.y֭= V+o,V&Åi Koz5줥9^6m66'ն dn: Bl!ix^bw Ƭ>C)h0Q/[@k>l0{:cѸw[5ncjY>wàLx'bG8-}s!/fM:фuZ_ꑛFm囲vawPँ;>τ"eIL8RQ5i >KGUNH=3ųgr6KLpC:o2no M ŏ>ɸ)lǩm&Lx"yC~P㕎aλL~i4cm:ș ,W>ii2D A;jڽEJGQ׎T ~ 0 |JKu2@ l)I>(#S5/wv4stp&zl\R#z(LL,5Qq,JxnsBއ&o0[6@;ڑHi\V*`_dm  "c^^KyqZs|h2zх쇿%OzXJ9-!3]fT=JaLtb1^=w_x㮁OBr޵Xr4,s달.g'D,4'ix)šP0Ɇ_Z!q}밠5:5J ?> tr5a} ɀ"X+8"("tS1EnlSL`M鄾{JQ@P5iatH\\}z.P,.CgMliOL堝>-ٓ,36hUsSWy]i&!V0U8WbWv+J4bݮWߘG/SX+XOsgʭ .3n,%)ťI!׽,>ag'\RCpRsN΂#"AsZ:GSk&tx&zr ̛CR,f1Vj8av~v#o|l r'oaSZ'b|rKs bsy:jܐ WXr1] ILrmΌ6>i1;˜lo#L:^?~Ab|b|ژ2ԗ~|d1Ŷ5VB*ּ!tc0qm3N{n?ـ5Ģ&1D{|86`8d{-|xQ A@F2)$mLB+SX1C\࠲@ZrwF١;+Zgr^\*LLBϻE( nU޼ݦb60J &{WVQJ+6eJcϽThх<&$i#JF1ZlAj4wF-aQ9b0XJBԀPEL9KJ;p8|oOd)J uV;~Tdeee&O74@F!@,/O*OS{cϥsS2ۊ`0wN/C_Aq؎\p|vr[jga>X& Eue- u,׹=YQ|PBZe@m`J+-uӇU|EW}TV'Z,T~ 8f;`Df6|g2J7 ~XuVp>ձt2v^]괓7fwxNj9j/'ꩧ &vD fx;uUV]CuJ9ؔrǢJTֺzpf:+ܢrڳ 1ʝ]8|rK:j` l]g;l6QRbf}*Kyع▱y_ :GcT;]wӄrVa$e d1Fh!+{|wg+m3>xQv3+d;ٵ( ILaI昷$Bg78Ŝ9s˭I:F g\*tD ʝG%Xf{nK+f|{vv^F^F:0AzW =c9}\ 43mBTKo90zݡ[k(+!ż .b[H lxȼD ܼ x!#H52BS%?FI,"F<x%#Se)KY[K}J? P!Y/{3;O&ic"(bBB(i˟uӒkơ[WJ?,aM?! 
Ӧ*`XQhK"̙ VrB*mOB#ٲ^?K!f5]@AbE#*AB .!H$h1ۄ RÂaD sIr\0;0DDZLYZMpN% fg~GXH4.#3w˨:٧"Efe?dEǼu d{ 9jcwɰ zхǧ{?,yիR2$$| |vfT!]%I0 D_q/q@ZUe׆bѰ]* q}q26kV>'b?N{}_@/NZ) lw]qשQR@ȸYi0:,/X"V"v<#_ `GE3Bb:(u )7y}oW 0jtbC& -,I֐+8=wYe0(0AѲBHb)r٢٧E8;{e&"1jnQ?k:#9 \ix4 ˕Fu@ƌ?&8|š_?{F /[`Y ;b_p05DIv.[l],j-Qd!ZTUb>\3 M&kJǬZg&0 |o-K%ƥi4E&6jydI 6"BQNk+>yn\|~ch "ہԄ S`B~/yM3hRcq(m{5j>L>ug7Go'S1~1xIOٽQ%%:w7ǡY3olйw7ղ ;=iYgwty{e({ eGol·Qu,lG<& <;Xt 0Ah u73HJߢ"BH`fiLo}hē\ >P1@CnqAL(IaMQb% zV̑pf`O19y 1\ٻ1O]v6z4G>;1 ?i,exRyn; M .R=gV=?YOQWCBBfb-ۭo}Ȟ {aȃHrKRǭ#% cF|Ra޴U(Q ·%bl˔pF'b&$:59 kcDQ*DZ5)W9 RPQqmLDcb.e 빱1$SI.$ <#:j>| 1%.|(&}Nܷ\Q^jK *3PZϑcQvDCa _GVEZrw?W-9ý9jW*6М:N.b}@h; :`B_B%^nWg8ol25}dX5AЀ՘A`*:Q43ةd,d2y-Kd tP>:)(O tYJǗ=Ā9=0"ALjM<(ÁDVHu mm"p8=ϽMd9%Ipfr}c~CЃ}E(Y|,Vdb[ oHɤSJ'GL@DjddK+aP0VFFC4R[hښ0K3J1&*yb [p1q[V x2Rյޑ)ڸoG\QHC20:߫ʆ?VOyKȁ(A +54rg6ACx 5^&X^A{Yn(,3zg}5#1sI/m=cHl;W)]ڃ̧"y߇G~sg$UD><`~ iC l0*~>'/U6F)D;MM>]9C"e`Ts4HTS\BdH c)cבGaU;RBr䬺:$NBd 6X[J؄Ng,&~ϸ/,B^ O WgdrA6ո;/jdO|@ct=6!a2=׎Hh"$!>A2z2q=e$%N4Fs6T^3A% 6`d1t飝xւsŴP 6T].\9h10T($  X.5D%U_N~ûGչz+w|v߷)]Ex8%= >"o'8q|㷿%-F4vYrtORYO%˟2pѣlk8ʷVǼ0zˢ_hkhh#rѶ =U򟋡<I^M/˚"v_KE3/خ?,,M{ft? o ~I>;:μlud%jns2X {zXO$j7@blO)&ZZOk3`C.v&" !BN `XІad6orٗO8O*| ]'~yPgu[/V. 
S-msc'q b?r n?nws4s՘I= qd#5|?xJ;ܭyCWw>:߫y.nƋ;L>\>r>7c曇c3 ̧+Y{oHjo{1<?媛 |!>JPqseEHZ7D4rqhFCP26'kލW S+4>%|v6s{g\'~%dyh; ߾ǃfj{aE}a!LþYvIÑ(A3lּK\c IIOٗ6r^'w2faME^f'8E؉ bXyL7WNoJ9Ɂw774I4DXhİ9fsʃ&!Aa3[Ckf^oZNfWZ\pi4vju矌ZpC(K^œŀ9\۽~%YY{K>\;u˪T9K >P1@C& 5A)gRXVrT@ dZJM^ુl}wMU麗U|.ٴpnrva4g7潘n\:u[YteXm,6"\ VoghǴx%#vf˗Y^nQ^W7C0>`(BNDaLKYи41iB^QQmVUW=rwT_W;{O!LI9 0 L1yamhx8D %:Aij$YGcF+w1R K" ?02KYN:FQAxa;+&NO;k+0eug>&T{F@A/1D *%/H 0+D@$g p@I{-p= ?wt:+&I$zZD41-$JK B؈EP9?qcӀT(!i 4Hh/,HPFB$&9]8b:q@Uɽpe3<.ʜE7Qbc^z .(E` FHeDPIlr1f'4b*rƧ<$)ěn*RXOȁrP%ׅP}5rcMAXG1 9:La0o1 z; nj"ҴCVU:wvig}H $rUȲ ւlssATY`.8ɥPgi ?w*cToʘK%+XqWY\,FUwI轂ή{S; n}` a|'j'[96b'O#D~Wn O0taP?g,trB鏑Ϧc~L?__<_O׵╪ΑU~︓L" 5`l GQl =L;qA* )U5襬YZ.}UR[\U5ӑp֡IWf'ek}Hw]>wkLݫ7v{>gkIq_ɩNkmgZ[h>^lv d0p#ŁPuJ7^Qw !H\ jP9  ꭒ\#:rxY0<Ֆ B% R2:(^S @'",%QP hf^X'ۜMqW2ήs&˜Ŷz]-(t+__ݷvY Bce4P|TQ/IMǀ%qSǯ&xqچ0jZ iz/螓黶_2ޙ;vf/EavN,!}恟 =MWKRiC \nWx=X{'yﻈ '3/9KWEڭbKyԝfk};N_jw +zطqå=+||59Hwz4VBjk.g}Q a3Rڬ^|#zߦ??GՆa̓m1[ez$&3ؗ!>j|j~?dMP۴J)2KR&lZ4^H266z\D8pHy ğ>]m1XHZ!'BÆ`$ G95y;`m=VYXf12Cp,%g2' }IT%hrkWQ"d AYE@(',O.L@B,E,i8 L:r 0s:pz..VߊzyD7FH%J!ՆGƨq%WFUS/6ﮯ.7,Z~}+AMNCv39͝Rf휴7||uML*Z92o+A2%gѢDMZ[S!U֦, уFL.1ٜLk9W3 ڒpvK(Mda5x,eeAebDenLސzzs5޵qǼ~ ?^}^\bR2Ax@w2Kӂ޸RY켨*h`2!ER!hHFd2{̺,9bfpvKp5JKd>EϕPYjC/{4>d;иA&)r66y.fGuZe*d!2@Vte"!Z Kp$E|HY%ɨFk63`<X>+UezKYΘ6@f+*e} \D@H!Z@6F2r Xjx7l紞ӎN pJݵ҉Vqq,wq^@X.*tq.4l!6kِm"L>Ϡst|QFؐeP B;dw[>XE6W &d 3Mx8OTn.v,qfwt=w]4rګb_͵h\m?0m>KjͿnwO9jwth: 7v>gyyvz(Ip67vrg:霥Yo) M-ͷ6ifwI9Ģ,t^o:+U2j殨i61PkEXQvkÙI)ő\I̱loԢ) ReojK_P ?͔N_(EG Ev[6&CbʊsGy%hx[h,u=10:dX"dʣE19G,15%Kg&Du<^!"P zqLEVXJgՆ'u|;`P.*~ ^ڇ 8(K @'s H+kK90- xxBY& ~/?{7"o 63PL(2s(YD]&f=:)呴EV5'c"Ioߎ;MˆD,>dm'MR8sR$\*&!s9jr/8'7r|M<jZ ")e)$DT9ȐaIDlrZ 'Um<"41үf5 [2_5y.gg̷9k28$SYt DM&f(K&S4(8Q^MӳKcƎ*[P5 y.w~D'M[kQ\Ԥ,·>{DbtcDH$J<< Y ԯPzRnqp>UIFyN7Zـg >etdħSjNN@BvTNR(IJRqΐ5E,B0_m̢/;²zNHuۺMKz--Q;}>rxTCc:} !\jvj-`ݛ)_x44/tϾ/%]r5{ـj\~d㋆+k?F+ణR{;'44$TX:`8 v{?QtPt ?#@`@ЁYӇxqs۩6]ŲJwH<91 y~^Sgw~R0QifEo ulb,:dxl -? 
zӬ/Q% ma@e, }8 Bg.z+td-+3Sgx6$5vw>?_MټR;1]ǗfI|Q 7)O9V81R(b%)$pLQ(jMQ=2}ls^o[aqW e kQY6侘=`ˤF3m٩۞//;sg̷c˺ |uƌ3+3ŽcGP1m }W STA=Iֆ͗ad%m[C5Zfe!r$V\ཤ{^K"91fM h 8k>w|غr&tNי,ņ+#Qz,Lfyz=λx2ԅJKʺ :sl:4_nQ@PT8y+9N*FY4Yr*'0R'U{!'?D#C>gQ"b$kfVx&H$@`\(%iG$p=d.BԂ uB5q<'Gv&Z3twAGsK~NGCItr%zoULqG12'KR#e Yk×h.4qxӖ+u~L|QL(Q2G~a l,:Pd qzbR0wۨG1cGPD8oƬokG^G86'bR:]ܒTY 23s1LOoB(T~V(B[hI1v Yw~RuC2#= ї .v%IӨHPVDōE<ƨz/PjO?$sluavaq7% ㏃t9j3Z榽_e,y 6y[v|w=&:SOf>j(\%Q|ƻ.E 94Υ)Iק|O=Ȧ`2UW{L+qI|f]]kps9LSGdzx.p2ec$=ߟ27OW[\aڮ{dq hmnɶ"SLpL|f.-~ ûtʗ8t5G'#s:(aZp2>{oѤ{\6߯|"~Vwrk-OOq~d&öf4Um/ʛxć/wDkS*/8}cbϗeHӞ6⯉>J1qC:DaID*]C01,ceڷBTMu8 >se%1Ie]F'KMX-'f>1%l h]"n$6П0m< df,)"rI}&KjONvb$; ??" wR9^ՓO2鲳x+_O\`ii2#ZþgrT)az+d!4Yh@8 }W){^a8$?CrdsHv 2A\&"lIIi ls0'zY/R>YzɝXQH_\R)ǗhoR=Yx,J =܌VQS"pu6mL BKBԧƴP ocۘ+2g Ξ"r *JvP WWP]Y fu>O_? >f? mR^z6~l7g|ʞ;vz-kf~{fd2j 7#N_@u;"0<$\< 4qFLnMYEwp?~2`M_ld՗-@YK ^@3| 1:wM+˩Lrz&y4–@u%#ķek *xf;4l!6kDTiX]OS/NS/NSC P+#lȒ.ɓ RYƔU `1mH LUOAHda S2Ht:\d3R*#w*h쩯F΁n>%shol$w<|AX bij[?iw͡o, mvԋ]Ǔy;G?;Izmꚉh; 2o]ͲJt+9֧{g:hkZμ2r7?}=w|ϓQpXM{='7]-5ZY:\r8Wb?P>:ڡvZn1fPt֪7&Fe Dȣ\}NԱlh}VFKJ/d kן.e/\F yH\EE3B]HAvJ0LZ}ZIqe-kV.ƑWzb6ޜ|SCx=2n Qz,QhY:~4ʆ|RW r㼵 SMIhV\lY|kԧM]g@sc9[xd:\-u5A鷛zVo3Bs{4Wx<%kYB!5W鋾TFBy7 Bd *& %B #2ك:K:QQb*&֖iBq%HH23WeJ:g̩GucKnGJ9ޑ{E9Bf<~*U-mHl33R=٧n6ڦ{Cf'FndM4h`)e66ޟ{1(./lV%kP<'NY?ϳs~&o@P-ONIm܉09 ~IC(EPNY25 \YY0pZt3WͶ@`Ɖ8}YvUx_B/"ƔkR0X9ۢ9EPg"Bj|&QDR"IDɉi!V NTŠSFAgѢDM6UՌښR6e%ͤHs>_3F)o qsu E o3yK)I8cǿ u56!,$zdJq ,SIH0X_bq yQƧ1'dB66QCАpd2uYCp:G,[g-rk0Tv5xֆZz#]ϥ ?H aDCȹm wK2TՇYicYȄ 9eB$dQ $4GQ)"h_FCQs5U{X%yrƴ2[Ewx_&"RPF\Fz!^-mg%#EŢDgqTwf7vq^,XΨw!T&rpQŒos!Ò丵ONi9^yD)ҏf[ _5ygwHn ),=n&SxN+IR(8q^O%qN*CfFyr@fџ.gF6ÇmtZn|7Z{I>d# Mtr|ڔaJE{6+NQKa2h:N_}%~uUU+[c3ڊ٣˗fN R@< $pd %!2'A̅ƒ@o%rYbʳkk3[2:Թە8r}9.}K( 0iY^*) ̊.妗n KS%sp%cP$JRhII&pQԐ. 
-Xwbyw3a]2ݖ8Ehi0l6XnzFwͲ֬ӑA2{sͲ%t\]`BB1#^~1maCwNJ-6l ?Jm Wdz1hslr\ས$B`;GOmWd1-~*Ę[zWhc{Edeihg·c%Lw ۼg.Ee<ƤÄw1c ulb2e{;n܎7lx2ԅJ+u!:s-|ABQx[qR7AkL\ҘYou0\( CfKe,O(LhMBhMuxޘb&kƹ7 N^Wm~;qR(CH͑(e7>xf?m~YD˴mlќ CCR6yK!$wy5Vrϯn$yf%wX\"mj}kęM^f6>EpCp-sk?Қ; ~l?J-ixm1f*%_kTA :R݀EhlBl xiv٩tz&-ϳFwED$F8ӥ8w,M!DT4: 9&%Ox4t@CL񕖮smGҬ) _ MܝVEغCG7#@fSn-".ZfT d D6%P&ŭgV3WC>C]1~ip&k4ΰ(?wGr>gQzl$ X QIӘdMi%*w#!JE, @uB7$xNTe9=PBrkyxuK]J)vުd }vM#T%YkGWx>hvV1)e%˻,ud0FIH![G# Izdȓ 21 ;̈́DLJZ [R*kDwf0"sN Mu鵶{ B#9"CíQ`%'2.h#b"0Bs,K#\ʂ8f>'Ah]J2%$N"AB%>2*n-'1F71 dg8yau ?|oRq>4xG?f8jT|QyiSӯ̬4ŦKӏ_zj,V?qc+?J/(\o&stթ4Υ%I?湈OȦ`2sqXɸuϕEr> Z?IYbqkgڌ\1^S"Dt/J3N?-|R4qϑOnkiΒ72 gůܒmE| $+2^ olN{]i5b A 9`ȜDf,n!W+uOnJk,/E(q4oן?w>}o (3^nGoNgEϣz4W#9^ky:*/8}c b?pː=2nL[ӛϴ0/7ѷGf;>lV!=-ZOٻ6$W?]V`1i؞``)M2IV7odU$bR$z,YYYq|q]o-ˍP*oT2hTmL)0P#KZ۩}ڴ6'/ gS4mOaYJ;ijH!8/x>YMo`Ѱ+|V$RL ‡ vHќh…S)2] etSm@\B.N  EWBC:rvvJy.{Tm֖mڡ|v|P eUgm!R:Aސ)I)N.XƁ62T{DPZ2UB_8>+PNFFC4Rt4[mMrh%NR^rL<9r6sR x<M*W`{MQk[aSsIt4$~ 7l57+1q?y_ө?j>_Ć{O' >.a W|.1( NƼlS&P[RߣD?y'Vnrv2]<7#˽†QGv/ ȗVxiRm¼h~s`X,[$(H9w ޫ ʕl uܖ[+\Gmȭ'A`|fEegˁV9j{ )q8 V/(M7M<7/7)}_w-/I铛qo _pV`d3m*=xSdI]qe.E5sB B)H=22 d9ѠD#KhrKD`,&x_"i  !b\pJ'Sh~-ŖEUƨB8U;UwR<dd; 9:4.De8K+5r&IHeӂZȈ )y^ x"+7/d>..^lŖYF6OE"JDQX"N"vq-3D4"ms}sxPR3.8!(Q{4XEr6K~}ENt w0f5YO#+9;e~ud'o+l1Dovu}3oC['<`v E8;z4gCm|ZS= \}]hԖ[wrt6`m\'nGj:NViՉS;ԉS]z -~ҷX@6ǥݑa ~::kxHVto9(?<_O۝Tv CּK\cIIO {L'J<9`=.e 빱1$r$2A3⩣FksN?"<5-8orn5cls8Apv9lc[rGwCc:zZn 1agM!NJ9Ɂw7WTIU~h",uh[`'fvj0歱lGu;~@M<&gR &N &dHfEy:bբNϠV$9}1L.fG 3i{s>\`W\fAu|I U!1438JIarQ1xΒj=+]J<]ЁtFmbuf;Kjk ؠ*%mQyk> 1o;c4d֗q`ﶬ4bg) u^D tҡ碯vf:4W5z8;@yU(jyǼN ~|yMA(cVK(D*ν4Dn%L6>@V4VfuH4 i)4 2w.4!F($Q=8w]0%],(@> 2E۝g O۠pJJuN.A-nI\1̌\brE02KYN:FqI|Vl9[YW6! 
[y;x:vOӞj~qC]tAy%q\u&M \k#)gMUC 6@{ -8 wx)AsH h̔/2&LY uAd DVs ǁ*$5 'r@yS^yMC̚Π$Kt X29ڊi{NllK%ɻ,MޠVU"ʫ 4"G[hH"R%/bOŪq|ZZ@y6w٬=rYnƓ0LX_ztZƌ3$=ۂK"W>FL3|,2FL}\bFdutbbd䊝[#%)LnO״|:kk`+b]>o̙mdpt͢w;tK:u[aJa޼Dy.}"-.||+K~}8;{d'BX{ꨩuGse%C ;xi)k] TXSc bQ˄ ,䯈>g =ɘnnom[feRgP0&'Doh+( XH(袱&lXN|bKtq&pO^/%.^i=ϠI03VXq%4B:E_3A,g˜X2MJZ"̩u̩ʠX&ICo>C#OPW:1\:x8~+RӚ&'j`u5R()7gJW}o%]o7s Ch`Lfo,f; WBZRIUi}񂸤 yWT pKe'2$S4 I4$% ][q c )1I I{ٔHMȢEm@.ݾ]]u뜪@\BJƄG'k /">,%Q ^([sŷʆʏբJì"h8p-ce$UȐI}ƸP ;\ Lr-XPT:(@BNIe"˼ @Ȳd*F4NS@]9CZHFr 9!Vր$ta8ZLr>'ljL6o|`cr!S@bўt5XĜhD#܀%`-:Q ʰL$!i??h-N8p]YZL/uE qY,6Nݺ#|ϧus0֍׷* y]^\V-)oMfˡݒ RnU޿ʪ}.,p ) a/Xu ܠ`BD]([gJjBH  MO \cȾNIX[JA؄Fe,~XFơ'wgȸ%b?zev;ͩ_4h42/b é程&J<ġިL$XV6)dFs,T^ 'EPm& #xK'+~&B1ϦHvqv`wV!p8GGFMT&Yg)q6RQc)g"݄K NȐR, x"eK 9GR-}&\L& 0E,6>P"Bo{!#' (L*b=B`%O=‚$I7Lє+W˘c"M ƃ,e;ZD%U^Uqlw7?@ʅ<Ȉ,D!Re֕* pSʈ}BxPsP*# ҄p3Ԃ ܧZ+'ɲ<'KEtj0%%Im\ ~c"h-q><4|A٫2]{tD.ddਉ5V 0e(/ $HQ*RJS9pJ +'s*%!ֳ/R摞CP`t RGkh"(ăP x }2BP(Iks1ȾE|Z~.Jn[ExVꕄz#S;7\.ʝgx8[Ӻ9^kw_du!vlżOn+>5_z$/ArAw7WBHcL*sGaɼY]hFȇE/sKsOەf]tyϻ wD].N9o/GhDGeX>Yf1DͺuAUscMmf]y/.vW [-o? _]\?>'uL yǮXWy&7NK8(1H Q/Rp5V^yGwvwojMɊm‘K^y׺c/u Gm6Q`qx}=?_3f%K5ή|?5$%\wf/wgAlP2_~!;QL.v.|K|KyhNr*ApCcA0&\ (8@mQA$Ih(͚]tN`&@]ΏеU5 f8vv?im֦%ՉWۥ:RW3<8GeYp[ aZE [ t"" 0 2"% щzRV m@$(-r"NI"E U"qC*&h`7J1 sT9 "YZ bt=BdvNyј_ ;ZÊ5Q vYסE˜?J )(PT ~+Y/0y7 M}Jooɞ}ٴslOyBk@AEj+;g 5qsϿ뺵?5MRf/[[q8_SLFWo$lHm9_mmd]I,v>M&ŶUp`6hhaޭϮ|MvvS$tS{xnDd|2"%Hvp|}ښ-( &͂46 ^挭Ti. 
2Dz !ϋ2WntS{z>]KI$;BxOC:L P܃D!Peѝyg^'9!*veW_a~#6gK΋{IсOhDZRb+s5A7^D7iF|M>O} /zX yfۯÃkƄauEL?~Az4EYmU󂓃+[uD' ǃ<<~lxYoWg ~|qw mkE‡z/B'UE]|>OMƋQv"q~`U JշF][ 1YO[~\q5o^/_^oWLqQlzh=_^R_ݭ$_)d`w?I:Qy+cbk szKYC}WfP}=wt ՉUh WW=ɪ[~']m7J'VLyArwYp' ۄnMl1LyOtdo;_x)?C(%ow }a'x.5IM$V S$ `m$XH{IqX84 1PVK'2 RHt.`덦!r, +^}o<}ꝛs[llZKZ:tw I=Ly'ڕ0J,a}% 9ax'tK ɥI{Ayg;Qt)BI7EN@Q3I%@pQA&HxNH>AVX=m9\םVUé ,I^rZE-\EL/)BkW,naFgsAu/`2GEw]܋yns'B0NsqjT;+WeYlqZHKU6&aR s VTN8̺n4:Fu Gk2*eQְ)&3&J iDJrsn~n%xh!|ḢWw{x18,SUf{}\5e-6 a 6y`TAA`KvC)Eh* )&3*?*M1-bIs x 8c:AҐ/Ak2kERƌ"CPgt}։jR䢪sI`Q+TQY'MI)_|P~출xnS S#[sjfG_Un[ΦJp>/}[ |u5` P *CF41nCP,N[řhduBRKl{HmcbZg )b6q#4bK\rI&&L_ͫt |1қoICF'|ltbkPx:9myȣn:/e~~iŽP :D('B-L4\pZEPw轌tT/6AtR\X*pC&^+7&W&(==_ د$Ug+[QΙkTC RŤ ũS%ZSU.UD&^jPTyֱmp\rk{Yσϟ70rPŘ U!B (6;/ +I81+ƀ.y0 {_v3Mo ˼;`g&B2oyT,U.V`Ҁ\q %98U붾 M*=! ZMJ 2VŢG0h&zXxS0bs]!u=CБY_؄$@5b$}a(JK%DDN굜+ki+ ]Z|UPKbt5&fҊcSN U2r UҶzbZZ6mp-Sd1o؄l&@5DHbH#8 #3J8 o⾖G=y[bπ;Ъ2D,[FzJ; + 3,&_󤚞K5VI  |0B(Ńj:OJƉ:3c9Zt}wi=UH@AddC& cqs$l܁ipEfמg_ǺG ߞhnyo' aqv:oNnFgsZ"?O{usy}fJ=ҕ?}Pl.ouo:hM˝8/'FVvrM.EeKVy^ݯ˭MUXo"-Iˋ6=m[6'3gt=8o}9듋O-J+_H+[}2f7'wW+w{Ay|V7h AJ'JVǥ'̫K4xkͯbivjq-:hX]bFEDV Y-έ[ndg{1%ϯ7ۓ_/b#!|.(Cݥ7ovlQ+jZ:[&C~ֳ[N{-xt@7ojA;2m,)GED70;/9>f[E^6:8&{CxBxoO\Ѵ ""U/(͠U,JG˦kTٵoW 'x8V}_ݵe{%fZ#& ơHQ|5^gΖN%V!PZ]H&wjwQF?H 8UrEnG_N'/tbľ3mq|n@U 4HbC|U0@b5 i7)ojn27,;l'BH+( ;:p*!sȍWi>p *5 2q$|b5stT‘OcԘgxbOc"5׬1Z'X+{cKȜ˨@2NDV3rE.0bnE\Vc*-,_!o"hYcQ>+nvxfo]腾J1WSb7U"AvX7߹(v s'993o@@)T[l+C5Tb|WWWzC`Τ]c,gE[\21RMbU+G1.{jVt[~~g||uqq]}5eo)Fr]1Y/dYbUoY~XM y~6\>"_v߳BT 23DЎ X*%]ZGRk!1&.ikBZnVmS| DzM:L,V=<0GfA 5:m*9DĬ5UBAע`E%zwݖsASd6S}:q f?9K:rN@励𡐾zӾZ6 Z)V{eS2 .=$%*mb _3W.k9%v;&tyEYj$' IdF5;Lș2( 'T#bMc+3ԾN|9  C bEgVa ˺ZnY* d19gݖ_Fm&E"vJDY"I"Nq'ϔ"`S6,L`"JĐYguUKV׉ BoU=H`0əTRF`^˘XFzuݖs@w,:\\ +:&_g7*y\ 墟$wQ<6ePX>/|TZLA$Uat-ؕⓝScݨ;GyfA<O!Q{e4R*T;bRvȞ\Zy*j s榼#͛)Tm E,GLT!'1D b a s#u7X~^:ٰJzfȝK[\<.cBn+Y͵a=qm{zyŞIc8K8diw=/fS{=%]oĀW_Z@'.θ0].(Ν^Ct PrCbC@\L(SEGу+[KLr ؖlVUU7>a!u@5)H$(Z)gyFx:ge4`W(5ZW S->B6C{wwz>{}=+GY7kO@"=v=C-Ի^gծ 
:nrvI@fy9(P$n^Qtr5̏G׭oO77w/_5!|Ĥ]ѻrO]o8Ͱ+=hoFp3uvIzrMv+~}A7v4m5+y]}qu0{$d9` _OZn=[t!Xj2Ll2oRJOo^٨IJbc5BQ E~"#^˙! ݠ̴\bY<3ef³?*=s-U/3WxW$-/]ԫby1 ]ƟG+mgZ{5.* oMZќMIF"=Q{V&0M^3c&K6sG\|xsZj+go^gK@:.i (DCVJج1aQިzy `߾Z7=!{O.uƉ'}]?9SkO/qO\ ,7_@Eowt{ }nq8GPh>h7-Q"ڗkT0roJiJCg};Э_/򆭴cncc~=Z[~||kDdI21L53UGY=ƟoG#teN6rD\֝;m4ڲXY Gd: T!F)#iX4@ (B'\fl|-J˥ u ˲~rsMβ˽JEKAZg0Dy`9K.}( 3gĄԈ.JsCs)itcHZ+R9Y'qC﷦ښa$H%͍4Axo$'+-:Bqg XN2#R-WjPZJe3xGSc (@ #rL.K!(p6g68Kg;<1ߺ`Q[jAi//Fsk6)vnE~j:.=_M/ѯ?[g;_f(4p8n0eRysihJ~B|qjS0k^Nmg$jٽX)1kf]#Wergfbףb,֗4tx)I"=SJOiTP{BෑOs)Н%H1% θ!ݪظ $8bgZIe8sm>Y;=Yf~m(@܅ʣLa q9xCs 0eMgbϫ]$Zw.|\'C{xJ[If:l ajNMs\-+qY/yb@e55t |Z4yGwEQ<'5R^`\r*}`69wql,Bauk Mvo'pDxOPK'j͂ JhhgeÉ;D6X l$fP7ƚ<5_lM=NQ}Ӹ"<>МI03Kbhh!-ƗJM(g˜Xd>1 5:71iz}u*Ak3/;(Pꉟ/r6ߙt92ڟ?ӉBg'"5q|.m3ku17e0gj0Q i >,Ȯ`> 鿺H뙐BG@Z]xO\2-syƽ?§e'![SM@O$O!FpH1y'vq1*nDZgJ &ꐼ``:CHH"K+f5 BZ/R M(I0CʒS)2V "9#7sֺj =.4WכS ᇟjry>&V'^}6r 7k7O%@p\@h9WYk>@ʃ*cG0j k4 W588X1gU+kwů$g'E/ LNk-%1UUj'4o:n]^{Gmo):r^kϤY~yٿřBM]|M3`sDPY7 rd(6>c?BiO.r-ɵ)SdȥY&VbL2"yBŭW`ʹ e) GuJ.%Yf.} {ˁZYVr&(E26=ʭ"lj?BD a5AJEM=X,IRsX23i1UeLRKs/6$FkULNH%&B4Ie&Ԯb]k85]LKӤ#7)SݪF'n-J[0hzs0;˷Y+.N49 Bw_S1iox ?1ۢ debN<x1ֱ`LtJFet1{J$eL̒ =AhrST\I9$jKj-aje* eeY(zYxTYQ dg;s SAe;%6! b=}bB0 Z @VWYb\L:hNMgd.i`{QAWQg혱YM`U)M[b("QǡR*Kmv`V 'L9'τۤmv 9wq6Js,%y\QiA!2@Zte"&H4$([|,)IgՆ[vJ"{"ǡ+KD%b/b`=idIˌLɕ'l[ -)r9@ (cd9vf\>d gLi 8Hd r3>` Ʉ@8!D%bQ&}$f\dF%E]Y.^.rq r$ M'UXIGCS+0Fe/!OEVC塩,LHsVz"AqicG_dF1z{?>5ۧ*9r77lxd9S6ͱN4nyo?, 9bBA014'` IFAnj+O ]> vo;w]rq]͕+@h{ץ=aom,$PZw/Y;:]Qo]Բ n{zm"%HWܣ畖顄moqu{HCs[w \vg[R.Is1E}Sn(< zTxgQ;ͭ͟G7\yWKQK75k$Iri0o DK0RYr9Xt{"umvFn3'#`7ɪ̽`)d}Uk8\>%ˢ!Nj,d7ܞG󱇩5`ꎔ涊xJe^&gAwU.lh/FFl0:cO5^mryb+e>WBxP-Ȑr*n`ɨLp,6ܡ2䬃 910U~l(&ڸ >j 4\vwPp|ՠkvn=:z&tZJ<:mͣbGsGƠ'@ms? ( s$NP Go,r QU7>a!u@5)H$(Z)gyFx:ge4`W(5ZW S->B6C{wwz>{}=+GY7kO@"=v=+:aѭQ{5_E%M)4޵$B귁ఓ0e;.lO|&)YEXN -_U5!Ѣ995b vo&?yd|~&qzk֠#Pw4_s2Iܘ}Ap" j>6A$ Mt{ڷbJ^]oYm&C^"?^\ƭF/>&/f9kRT}8ѳ\1B3,8GꢌAjn7$'ZhfLU="2/lC  . 
2:Q/"k"XM"XAJR}"FI"U<1C*&hȽC >@d ;G$˩10tFNhBIsF)ZE C]QWp"T3E9Ϥ~I3̊Mq!CK:ChJ(e/5 3LY3K=t*)phAH->a|sߦggt7*uOo7 .n1)d 󲶮_q?Z) xz9y?uӇx9 bMj,+ӁE͙[8+-Ў@qd)9sv.<;F˺p.~mMWBH.'i=Ttp=ONh<x-Rb=&Gq]AD9(+A5=~;E9˫2h.[ G2<{f3m"0-_V:j4")!(VWR¾\2wūj2UHE޵qJ+)ETTw v#gWO~^EMAqk.FUcBLiثQri,l)^χfr BT xu~#*]P-7.H$jg~')B~b~$5!l9ΞWMa3r&ՖNWZbM06XETJܩJg}˸ы6A?6-MOHR:VE֖G/J*Aו) +Whfah홄ڝ\45vwtDž<\L5{&.8װ@ &(9 턷c7ӊM2]>7-2n1t{tg"8Oc=Wn9ӰA+g\QAj%HÂx") +x4(TEo/ Wagbۚ~R@r^# RHt䍦!2,ߕ`kAӇAy,v(:mUюUKthCX1@Mh$ (.* 2E|Cȶ77Kٻt ;.^wN[Y@D 8Qu/9(S<(%EџuZY]WS[u߀0Gw]~z]w5m}C^m؁,e>`BOUH2AKSݓUu>v/ݺ$JŒ$ z=uy{;sDhC+}=+JTX@uvh\ZRd!p H':2H !@ARwJ(ǿ mD9Bs^hBhve/!:Dhf#`Bbꪈ.ܹ}g)\}7j Ec>JN4@K!xQs+҄/ W¶3 XC@I88AZc4,%oF!:OrLS] _ p0:%x8ۨJ Y~IDW%υP}/݉͡HGgt|WӬX=?qÆ8dP:Ґ(>䑂MU}Iz]Vrofn8`@'Wڴ|vk.pmhS̃@+0G MWcDd1DSݞJ2+czfl $ l0&q*OsxZ) J*Rُ=B|Nfv[/uh,+Y{LTɹsu9yVL?/._Z|Oq Ty_xZ_y[6%ufjQ-&i%śE}5)+ Y0W_On'?_Lm׽_GIL"ƘCQcC,~Κlj)1`nwiޏ?}m 78g0RWKf.R/rQï?DӢE |&Kutk\o2?U",t<;No}r9JҸ/rÃѿ\!}WT ;5۸780qa)I`)]7TB8ǿ>*JnUkղ}F9Z1!26)Ègj3l/hz(7X'_E(7+%u!SĉWMV *?*Z_3E nwmIغZԎ֧WJ_AdѕtCwWW6jNRۃ$NL3"5Dqn DlF(^k"V8Us(MwaVEQf J총Zl-,RZX: pQHzii;s)CMսSmǦԎR?nj ^ݟ''6eC66#9؈Ms ܞM& f$&@C{C%n7hQQGOoۖZ>`mCK^R ^g-ٔv%SQQ΋?WrbZ^dsoON%6SN%|z*oɐ ,Xn2Myجȕ)~u<(QQGbXR8})-IӅ軭I6q*I~ÀW)|XG?8"^Қ-J_8*}~TUCCбOpJi zLnv"Xfr8v߸ĝD~}n2_jgbfSVjpsK4d9ʞ|2Ub!63W|`8M`z9,G)Ɍh^ Б#Om>'ӿ o~xythd0uuR7%$"аsL-w,L΁fSkdNJcVݢ1I#[! [*/]Ų4a2v\4ܴCJy65IwS߷ODu&Z:ioSR` tKF\IHR[*=%U~r]!a%tsYx3QsUxÙi{'O$pJHzB M2ur 8c4*'iF ۵upʀS )KHg@$ @(\FN1P m{GubS ')(Y؏סu%hJ@s%*F#r(h&ƀ\r)4QMfp2Cm YSծFhR5RJBI( $SZR"*nDFB1(Z ݆؂"!H`(nژIxipmRphq=$jH̟@MWy14L ͌B T%†H0 r Թ[CPLR% )iIr%qRxDǔCkBMgځV]9" 1|Kd'ŹVz ^0i*kqVH7^F+ 'JylҴFh*O.q z"Α[pIbF$%\2Y k%v UG@"Z#yR:᭶(Q*}k ]nǖ!ϧ_ ` IAEQ֥dIFf7eZU|!4gmv催a$U|i! 
C6WKr5jfE `-iЮltG*C dCb )ؼ(NR!'D_ aV9 5&>A1]mFW\p v ^6 mzZ$t>%‘F YiT2Mt%CHh|pL Ρ&yc2C A9J#HV **2MʴJ5}46xOy |Y%@ ֡6(28`G!cP$ 8ՀQh%᫈;+Qyn6 ^֊N"`ÍNhvKO.N&!z*`YeD &PG]Jr s@ Q/сwK }CGzv@ @z R{XRrhDκ$!;V@r0]Zx _3@[v(f roNy:  DhP8f4Ʈ6 34Id(J+12~R*qwq`D$p*yX0°" wMϲC|Љ6k MJ!ڲhF& ht%݂ ZMުh{9Exgo(mU(am WZR0 UK Fnse;Nibl눙$KKFPMlJ''40z63 )xv%[Y:64k*ZSRmzAk7v)oOZ|7lDCaFI*ျ!(A-ZtCۚbCX2'ʡSt9 L=iҭlx  "i8o!X7gSMhf4D2_%T,<Ҭ$\M`$9DG4\- Pk|X~:C+zgjp!G@(]E _ZܼZ:P=:5:do"O` =w[ r{q" oo%`Oα]۽7klZCҽ/avmy}TEƼ&%P6G %+`׬j@J F5+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@/] dwTU7LgW`wV(gϫ^>~VF@J J+0$+X Ŕ@ )^9ΫS=AӰ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J |_{z@07W&@2ZV}J 9@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JoW dЯJ 'Q/p(%+` @ܰ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X (>]Zz~9%-5,o ?.X?\bt@Oh5x|fyIro]fQ:Zߏ\[}qÏ<= _TS{ZNopezY-׫v*US٘^P4_oׇ{^L`ѿ7ܩ3. 2)UٮJ%1<.W$8-0N|Z.үKb˻V~h&KH4hӓ(7u^>Mm;獀!=gSO[+^SYn5,O>irq^|hyQTr1x zҶ¡tpᴌ݇8,V !C7rˤMאGo ΁A<̜umWkmQW{x6n:vm-)Uem~[DFF*FO]޵>]ZakMݍ fxv,gAgpaD'_=kڪN܏fw gDLUMf U/;lt>}qbApx{C#2-=dwnSͮX]8ʿ՝[_\c&+n,5k6PF圮=l`xŦѩj%:o$ܝ_== чI%ݐC)P(rdɯ/gX#Zթ.>3h~{_ElMst&yțR93@EqL~ePEBsTpUbV~$ԏ& Ȯ\/DLk9rd`c+O2*FYb5uߨx$. 
;/&q6bq6̜;& l㨫_9*.XK"XS\-5 TlɔQU#29( 6W!kmC~f{HPӎ'%)tbs'DrvE Ar~_0Oz^z+#sN֜ARPlZU٤سs8icv꼝_,}_'䘗Ul p9K'ߟ9pGA#໔iD.(Z5c)KR½*=\UC2g:zmFD`YhI+:zRo<4 CxF{-ц颈Q6;"I6R\ ݌ϭ2Jx c\xTV!88:8zZq^,on6L[vHŮM.A Zr |&fowvJDE~9YWwѥקZ߃\՞Njͩ)dN݀Uᙧ燱 a82^zWVKÏ +^9>2`P_jMOOLzy*'Vӡ 3y"P/{G vrHRR`<muTkԓ-=zO)?NhikmcInpFK,6Y${pd /Ւ)R%AV )RҐ5F%s=5U_Uׅ3:1V@:FzSf0 y^1AqOTooRLg)E%ZɸN8zjLښ@Gǘ TLH_Z;ͨ /-|T ?b &5,3ḍBmJ2/HW0QR 'HrXg:k3BfCr@0HWgeAH>8?-=>[;hiںqժ[ŠAd`X() |r'\kɹ$"A\Z\IV/ Gӏ_]QtQ|o .K{s֙Dx߃G$&Ci\G G#a5iAҲHxʚ Tk|$6Jd@IV).d"9LF8aޤα> }nB Eԓ@QߖyM , DPEA,^ld6mb &Dʖu"'rpDʖ!F3@)&&}7?oWڳk3;fMT|CGz՗oC9l=H!N׏u3U~JPN ?L\r 3*Q5J,3P~8hohxM'6Nl( wODD`Dm3]5q1aȬ  2:GQ1) x?nOՆZ|F r%iQR/~_ܐl3غ1_ b*fb.e#LfT ՊdD*V%P&ŭǃbMkPlkCh Z3}Ou!C#CCPEdڂc 'p(Fi$Mc5蹃Q*žpȅ`O/dTP mi!sr"n9{ֻ[iu>?}ױ,'77'Ahh-vGӨHPVDōE<ƨB=#h=< 4NfwċT7SEqū?M8tVo ;1**PW6HJ*|x90b@90b90:GnTLȴCh!&1h-D9DTfd/ Buˍ]}ۮCʀyF٧twkۦ1 T71mB퍼O1sr<Rt,dF:9$ _Ngs7ʂ\s!qN\;/lzyFa3zn*jIrdg (j>]窫koV-1eJ#&㤷-Z]ׅ݂0h* UUҺ8&D7:kC$)Iώ3ŜYd4GƎp|\ .yi0i/mMleW I*iHI M(AgQN ڈU@TGg<0,ҝ'IŔɲsI˘fγTަdQE= Ho(}>HoHk kwng1~_]lGJHK(7fJ5?oWNqD9a2dYCײک9.Nc@g}je Y $O.HJdmS' & _B~ G% lܕeHt,%g2RIO0O uFΞF7:ù<;ڙy}]Mr:m\cB"8N-nxQbw9 ~lSS/9uDn%m1>.}B"\0^gۀ4U3z:gzzBYW-?`'/ZL=f^Jk|˫y~gV+j'1qymس}z9ڊ5y]w1/×_"|u6d3Z"5\-%I6ޢ-T7&JeD2ҽȣ\ccvFndAmY&.ϐCdcJdGY9yYE ٪~Kh#"tAH xmAbwUz{;l6 GtJhmਖrzsz> h"(WO2BNP "sUPUY("`Ў(黊o~Xo>Gz Q5,F *ewep Ld,8ƵY#sj~qA@s7ޓ6sߓrdAf^m\J u;^ Y8rPO :aE! x_%MA sFhcS1kvpqߠ(F+&&Kd\l3I}08*"sm cu߱*{H'E__}z;ϩ'?ػ6,W`B^b8Aa'YkmdQCJv=|HIl%v}{nwUݠ,(/I;Lg kQD9KF4&L@q/+.fUϊןR") HL]a-qT 0~ߗ85ϸ}v Zǫ ¶"^ۊ,gCZzW|w& o&(9ҲWR)N%9 :Ϣ2+otA wmjkJJ%VxkkS\zU&D9Rț4sT؛8ۑ?b   o,ûz;| ʟ0ggg`VElr ^!3D$$8 R7T1r j|RZ$ A28%M4%#\2r.1{g;byZ*1 j6tD#ss_iB`;.W<$E)mvm\:6R!\Q ̐("eȐK8"l|߽zg;E}w|Z牟>D?ED3"#"N."FŤs RV\<2RɔXIe0rƴ!Y d2$JřBulcD%- 7b겾{ SdCzٛ싋g\G\<]Pf Fy* / ~:6,kK+s*% r\5g?]h:~4'Rv1yfjLy ξ.tҼS%FSܠyfWDh 7z;i -/D!,^W7/\.cfdCl(/,Yi9s #d! 0di茢aVx ")HX"W"Qg}\辶bf}}s̟ 7{0ݴK+U˺nꦜU=<κ119nn*壡3;)>F2I;q4c;`\lwjaAZG?F/^ O>Vɳ6n"=~utMщu[%2ݾk0/o 8F#;)oYGBVwʈFru:vGcv221F~5n>몃k3}GHwr-v\Ϸ[j0hjۊD5IӃ/Hnh(.n*Y>B֦? 
kˋ@4K 20B u D8/ g#bDɧ%:͋TApt=ݙf3~ޝ<vc?yS7|oߕg86ifӚv?ng8M.(v \jEҒ%F6úEQܗ n<}krU#ʎ'[TUe#H6a!ϜԐrIutSut ]D(`:ނGy>xu9 i]S3ve6uki{Aʯ\e+u1˓.hr/g+8\yQ}po1ZKթ:P^6ٱ75o fWqN iysԏ9AHXS !0^4cG*ՠ|d{?<2>͸V][wYڗV)7Qٟk~yٟ'43?4RL5,! wHa 6/`TxѫEcV+` 70Dpy51%'?e?~1zInUsu:-ㅦ|x?:s4>}W6ݢW+ͬc}ͨ,AZ{E'g-u1_-~/ty^0\ ԋdXy=V\ݍq1nrؠtmEJ?=W!:eP_cr/ \{j.FՎ%Ww"ܒU hxz;=^?_\*8}[1uozSD0i`Ai/C ߃|+LKZᖜ_Dgjty'1pէwYbU]y':.liY.lv]7c_,ˍ4]X/Q߅ZJwQeI[ՍGa'L=7{˟u=룳v+:'u7:͂<_q}X@N_vOZ4[m6QbG2Z.胵UeH# LHyom&k}(u˖i3iCehm8ϑH.O1lLL>"%3ϧ%zJ?Ę[|5F{K>퀻&U>ܜ48&,}>gBEPHroN# hvGKwy؋vrSO0qzNFhGe2 pu8_OFP45ƕ=7]O}u)πs)Cq?^rD1y8}m~5·+G߼8LZ+<\ri$fͭ)#:my_kq|n \~ȃL_ ^N)}۵n5 _SI{ϝ";UB-_ fı&Rl~e"7]]$.4sjޟ[/e|3;eǼgAͿ^pړC-h1Ž9,uOq2l:=zfGj/k .;W׽Jeuo7؟ 'r%%b,~mqөx6]ts}N*ߞnJxanF,vxw,n׶K& ni;R cMV];4W eP Bʻs3{ q0|?n'V)s.F\m{ef߮.bR([2V&xHK2RUsCֶ+g=N/r&1V>&8eO?NK^yd񢓣lK038Hڟ>(=UZ,k6wp(%IP)CH"^'&ԅ% qI\ A*+{?1%2`6d ԧLZy (1{BFiHtҚ}Pa $C0D #9S,ީMzP-#dWT|b BjSuMEe̕ `R*r*sb%>MU9t#CFG"E+1xϴIFaGs839k}w. C 17lb$!M";P`* Cw6 C#kJtx2c*2_0Kukwix! c1G{[?-_^oYc}"*X/@I6rQgc"vqsFUPH(Łs,#D͊tru۴W;'FUvqgIf1/=t-"_zD7I[U2"P )@H`I$xu='RePpC2 ɱDEm@j3^R@vdv9JbR/ eQ0A$%H٧dr c% (xԂsH]Q2ofV rL^˃*t,@cPy<qL*L:&a-e JVX R> S*!V1ED2 {'yF%= _q.@YӧznAVbQI2 (B2:LHpi#hmq&qק+d34~,S3d+5rfL6 ˥_F/K`B]N*Ё'BnU{Ces C8LAA6[e( yJ Ȃ@ B\*HE""n1 ѽԄaeKQ4Ddbt5c)@fa. `I5 pԵX d#T3((s$e不¢+U KP/u2 ~JRՕH0jWR>1i_1 &y!X@!_ 䵑ǍQqPBIѧYV $R CNl rZQ3[lj4~3AiQnq}~J7vdĬCr|bH.+C8ՙ!@߻SBw P&>Ƴ:q1 NTįZp )"Fۺ HP6:" Ƈ³P\ڌ̐r`(2էJGES^bU Շ+I(y{b '*z%t_hM(r Tx$"LDȼ""kmȲȧEڪC@`,vy`A=-nhR#REj(Z&W˦`6]}TS`rQR0*g4nh=dhEYR:[&@ڢcQ8ڂ VgGESU_~(Ӭ(al2JRp `|;vv;d*,hb⮆,kS0 w@܆{^s@ QMހwK }Y]P9B(5f2HcBQhv)0ְc[8 9@g@!t93V53b- Fyx%6)6.~cQm\g$0,6QM ;#RF WnT\[?0wϢbt",U@k6֜ß?H倱͐:0M`QIRhW jTrV-=Xu^Jw afs؀$>WkV%TaҹdJZ<i̽g7vNiW# l{ͤBc&A7aI`b+Z`8FL-FTah XeOw 5$޵繨)UFhn0F~ihG3wm A x XRcF5Jt#BK6\D[9jPʥP4]&T*d@y!UxJ$Q5@>N( j=c5X߸UaݲIY!|?uEr\O.DN!Fu/x `\ة !-JFJQ5 Yc4~ aQ=+NFp `Niu)"727kԪT gF (I<(0SLGjJҸh ҵ#k$Gw\m^5ʾawU@`H2 G tiY jCP/jC bLčrzs/Kx>*5A=Clm(1 qƓTV̯^pH`7jgC1IXU?]|3_v)&T~;"&" 0"%. 
&J˶+.Q튄ީEF8Ø<`j?^y]bn{òC &w:)[XDBՈzr{_Cd1t+EDx{btzv([}%]^,Wi> ..Keh*yHÏ^ڣXk.ؾq ~~u[ooV$|?~Fn>"ҀzlߕxXPV0{&}NP1B[1 WnO?U'Wŋ\mb^pa`-?~WAn;HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" "B0Z U{"_W\bvj:v삫f$-   HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$zVU{_ krW0׾J ~+ip$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W*R\kW^ zbW V\&W@EW$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\*T%m7v׽lk§_S)y7LRs{jm+nA'Y,qE$6 V>.V;s1#rl0N%;L 'eI+- |5 L)]Χܝq~=}(=z.iM4*LvUi| K[@cEQx ̈A<9wu Ts-L ~7:ve~$Cw]>FD+n+.+7Z ~3,,ʍO]/v/i{9 GsHۧqy:?jv=*o߅#pQ_\(;npMb/Ū(ٸ,v ?ޡ^+M)TQ$2%CmD%ѧ~;pw>rz~2uR#5!@K QSܪA֋H>8Vt5eߩeFZyYz%d<_:S S)#m+)8Cc,$\^jq\Qyv8@]D\5EQ@TV:9G3n\r̗#ǿj@}VX%4C`Dygp)|0Xu0gؗ=oZQܜ>/F(`2*+>DJMUhRī0>,|R>!rd`&5UB7Vk!pH#(6)F1A1F3q/_AxsQ爥j!W%-gSX'TA;L[UV*c3=w)V2mӒ/Wu%#զL`E&5J1?#eOOӛή8WK1:b<@H]KyR%~ͅ(cYVOL{Q<\|o^|0Ygs?|V;9p8ߓ >eZ [|,Dp/-R2fkYx=w(ƳƔ4Ex3"tzy,7<6GxVhKѲĒJ۸pqRq)b6*`|CKuiqXc4ylt'A/0;ᐮ? ߼.ۂրrݐ@&_D3\/y2=7L\vaq)m(ꠒ7pNv*E*?Ģ3~:fU1|f s +4Pj[yں\ otP}w6o_Gsc,?"͜,tLG・7->][b^\E}; A/}DRs-s?L1@S2vɗ)e|^cv0-zʢy& ǽ ;%mEj_*:ڊ VEMsEPN[- ٶ \C-^>ϱO_38 |i<^wxa E|%-Epd[;p v^2g90,6V!{ƳVɹD~ʱ~!M8i,*z %qbu)M=ymS T(Z˒YmO!XJ8b,FgQo;iJ'_8s|m(`C[ɬ*EgxRf`*Ⱦd+<~XC=1{_1ph`.fvx8 /[,'ew)*elI )EkS^ԆDOPxLh飆y~φy}*7zeiԈVYSuNޚݛy3ٳժt3Sn:5ypS[>MnlM1pAխ.gu9>j:iwu?c{vk0P^]Ֆs7Y@u)^OgXfDԚVgeoWaѵko;N<^w"};9N=GJ6Mi7~ kC> pz}XR6SxO*+ޢ։`m}Jw5~ڽӾ/iJ׽}Kay_?zDƢoD0=G%>mq^nrqwqZ ) xKa^|HXꖻ$`EmDR*uls=t\Ov9) Kd4q ھgUtFmn;鱜7x`vQp)=ܰk[y/B[kc|A5ǞY BkUq Q9E2 K(22iI\`7ȼԑ)sw 2u91t-+ K u,BLG!o8dMMJE+r+K#.:‡OE?Ky/;+;NYI429Q{]qC%b )aS,C`DI0J಑>wC2h,+_2h| g7Q2jqO(!_̳yZϖ/i5Qs!LAhsV:&118[.q|lV*h|)ı=Dz)reIY>KfZCVZ۔AC2.QӘ[(ŗSXn SNX3V-,Ed NO(Q1^l8]h Q/% L e6 Q$A D-6z.bkxqvt]KUVd&l%.RԻMU7t _r(Tu^ܾ)*_zP6տ}4Uk7^M?몹W0o~zUVEJWEB/~׋o&(#yzzYB.n9ZGe]*'^+u]S;wI/=LJt+ҫ߿+AAhq+"dm$OoR5V*o*FlonK4vNkaU6ݞLI@! 
%.nx;V?^inR\ `8Fjerx.曛z~ ڗ$d1-e lRE4x 2 K1x J ~ 2:, ih֤h4N%ɅXѲ(.pvL1v82pl$B%w1hU$P(>b"8.C"#46;A/PntHӌ(鐘S}%ɛI{`O!G˖貊`j(vaޣ/4RpMh$]2y:"HeuмH`)Mʪ#ewǘ𳗵/( :U u5@;GfOc1?;2L V`})c(sxlSCN5/}W5"~_Ž]_^-S~w+cшS0-{{#h:^~m8ruJh=^i΂׫3rwI0zRV{&צ?Q($9X.}^ 5J=wЦ%3wjˏfL|$4c=,;{4CTl,@e65 v1=v5sѤR ֡{A8O&S6n2dy='Q_:l%|9 hMgPAhb>1IJeZnԩG5WJPdx|J]Tl}}f -wIPT{hxo/.c Œ&'u6G_yVg[;5!Q(@i,LDY,ց+l ͞H^1'p*N#pK83pxUg>YsNnPͯ-+zfy f90|fBnF.tITgOrmm@ ㄴV>StU:X@74 Xp1̼ S,n9R]VP>\rBScb]l7 ( 9e4H <)*i ^_ΒN" nO-XNDyD80R@w$5L:+1=f,1"L$;ΆPCoSX"dέـQX fR$"Hd]Owk:n^v튍7}b~m>_\wVv@3^,,uYԒT{EUiYzyHgdpYpy*)aLTL)oeDNsid!K4̑GcqZ2,Fk2Z){g#];Ύ98Gwm[efuQ&:a3{]C4l85.ݪco,qrnzr W: 7W]^bsBàB.omZԺ}Һ=AY-_vH-kjYOVnz6rt_ݗ}y; ̭ Vsޝs\|hTZ0&bZ"[A6bIt0R˃(@fN;SR^nafmĀʡ> 6&3x.TtG養4RN:mg)j;& g;ID6KQd[Gc0Kaj0u2ALWW kgeZosP+uvE;aJR:ZJj^K%3xb; \ :BxZ6 UΐLp,pL`QM!:W*0ɧelHGـu)A8 >t-2<_#fy_ӶLTqŷ}S ?"{*0kgJ2Kۀmٴh`-=g66><^c]r֟Ǹ;RlPxS\ %2.}ˤX>mdK_X9΍6!ֺv^מ]駗4uz"41eqi>`3Zd,2gHÙfh}`)ϔúq,*[I/zyDd$$pX \1ڇyPdڝ-݄>';ϨTVO9LzFKZ 2JN۲WR)NE:ϢEʛ$ܻnMmMN)ʉ6eAqhI<&3xIs-= |&yqjXXM3vBY ŀ[ix/-3f_gqɟhh4<:{ħEl!,dzdJq ,SIH0T_bq  HVfi(ΞODqSD AC2%ɷce1hleĮ&È&biǮ Qd V0Rg LXܡqكLR5j׆;ϥL!\V@21CEGY YI.Mđ0!eX$k8aԗzӳ'>> "Vӏ]QUFD5  9xt<242+&}u nJ'Ŕr#т F*)2duK87FL& ΄Ԩhc$OZ4o~43?NazLF6?|:mi|3db!+LZP?wH 4@+Oc~hfEBʥ1u{Ja{gMq+~fO/>[eHiX~̗r:꘾?5*sS_ j馂O_iE=Nɦbs,c,20Y%-լgL^)8s %x i%K3 h ZdZak,.ɟU&*-wVhƔGKX͹'JKv+[T"4;TWK~ܦ4J,{KWNzkgL%qF__Uȥխ@YmDӬx[GlgL7^E`%iϓ$U!Pf7iu3FE측 mWx-u%]j LLa**$,YP^e#klV&3Ɗ=Kٱcѳ XuŽ-Sz7ZʶSz1;}tAx +ǐ68e=X) H೭{dL0Q[c8Ƥ}9hR H0FXhBr xm$'HvpC5qH\ Qob&,kb+W ђG],_l9 oQۘ|f qRJ): .M f O rYY`gG;&hO,\(dɜO R%H3# ,{i$c&+BIf5LЗ;aǤQdr&X %݈lg:TdOaKۘp#b)$dr4D GV8iWE@N?zcydB`Cɽ &B61^'5t~-mvFy~pކNT3f, dcJ`0ē`΁;f)ZyAv \(E}}TVM-z3!͚ҝ,Ռ|<"nrR.Af2KD/pS53ju|gxDc'xRK%*,:%4^"G4OL/N=kg@׆k! 
ӂ S y ]qTNc89^>PJe#PF2*G&Y#O_ta{U|Vs>cM-q(d!Ej{vgai`GY/bA'&^ld`7PyL -y>E4N݉-1>1a|jt+`o]]OSIYSҭXٯyLRڛNͣzca-ݡckAkغӑ2ͣj6Js[!UG^MPx5HB]Jùm&?M"C_s uq?\LRӍl<.hkMQʙ]'ТYIoW t(c{:YX#VNEWߺ蠢uk‡}I+b6c%\te%*_ȬJT,d.mG{oޟW_!z1 '{wrnak|Yw2:_WV^RSjwFiT4i>\9[F"u|hN 5fJWgKA7}KF.l{V{F&Bl4y#7oX'ԛ6k"t_V.ŏw%~mNK!-w_ }wae %-D%dL'Q wrjlh^~4l<{N p>=Z>bEZD[:Ũ@1i;]\7y. _t_]ѺSdy/B8Ncv=Yrg4<{8a]`C-yô*aAYN<)U1\Z QI=T9, H(J*<%@kL\1@(ӆ!ft'x MBhMq*W40擶WȪƮIy厤{ {U}n7y( oϓg~&P7? Q;A$mi+T ҪzȈ>LF4~gDeD4cLIš)\ʕ wݩ}=5L2M0%fPQe! CdTé]9]JX΂սcܫϽzro|g]ZU]6feDaZ%hTfPp?$ ,נ=7=2 eN"'OAQ>I:M5~$ys :3S#7;5s293< R݉_yUpiJz([kn-܊vɊzpv=RԎ>s;EyCY20CԚ$FC2ihBeR+QiG]YdFt=xRނDBYx  |^M Iǘ Ϝ>gJy L ʒ6R[mwll#mJgM޵Ƒ#x-*ҁ"b >Ҍ<3=O{fI[XnF͞&}d=>4ppLuRhԼ:}˳C󷦝4響ϛ/0M3?͛2>5{i3?$:n~Ы.2h.\7\շW{U%yy:F)5 >b0[.ܧlnQ8換.څ휅#~a4;]ϯ{R8$zs'ho䕈X[JаveJP%5?eSjk)݅+05yzڣ%ʹpLӻfNwy5(tz:b҆I w33~rrt߄7dyOk^_+^{_UV|E-AOsw{T޼ʫwqSTB9;f'tv 3G;te4^DB4ҳ) +6w <0;HvkksFMC ]w茱dk~oV-׭H~}/-Y_;/®$RkwPi-{~tVZYI.U[Ywɢ:usTosi3\>f>/ K_m$gvI9 Z6c[C5$`"ڴJH.&pf8D2:>$"eBr(&͝ ]R>Xn}QJZF{NXڥ)B @Aou5A,QQBF3،2CѳmLu+:&Uwnٮ7ijq6֌ym%4VgChuuT\kрsϴ=|{m;nQs7CH*"3d QDIrif2 :[>'3H?.h #{wg r'Ombf3)K`aj(!y"CLL.DJ'qT6|6m; _I%hTIu‚tymg+Yer}>3ڧI 4zi6 m^ltciQ3hr" 7EYm$HS.V+Ѣ$D)3vE`Hr" HrJdZZ/EGCMb(K' MJovֺA:`2]3slrs@qbZl5^NKG?{BWII> g̨u>ս`./S#)uwWGcN!Y:?_"X/:(.ߧE3=eٚQX,jz1> ;+ȍAZYm2(֬%@h0 tA\kS{6d:몏^~4_{Yץ :)c7k>s=ޟ޾Md Bˣ~T_*H Z]gݚwōqjn;}-l扯$]b4ֈHSLZ4[=.Upʇ.m~|~>܇}΢C?qvѰ|Uth IJ֣mrNꫬs+p߈O s\@Ir&C^s<ȗU HBDZ.t*!@'g؄Z.ׇUxX{6\#~S+ƾ&%ߨ9}K`h kú ;Ms'w꛺&&Re1E[rd.l=.@6@KBnJGHLYfv]v(E1Yk5F%CkAHOM@PAu~ ć3:0ŨB'rcd u,]QЋ,cوN1hЕ75Ve݈0*K8gޜ䣧7~7OXCR;Bʾ tA9$*Tw>PB&t&QyQ%z{ H˧3GPŘ}ȶEIR$vҥ!hQ24FL1"[&(ϯ{juXYh v=Լ>F<Slj/lwS@lb DJ* J3OcDѯ[>.Y~#,+X Ń"kQ15(&~ҕ,,R]P_[1lr&)3f:O=ڐWfo#S]O)$&T'b*A#E;ʬ.+'230 28 cv* e(Rz YDF{(*t[bQ&lR0Zi0*c(˖ c@dOE+Rq dVRxnzx8.N#z&QHoYK~,W/OQ.^*&V' 1GHxr{7Xxr+!dzdzm򿃇/|̽{<;cw.[ ]OB5_΅t󿽏Og@꜓.-ѕCb|eS3>e@㴂aǓSoXۑ7)KdW\PD)M*R$ȇn g=Mij@{N[x,Yj1ךR^=T>/0Һ>825m j B 9/!A'xIJ3T`bVL6$jm]@fq^W .C majn}@tow1.FAG|zT..sr3##ͅua_!f&9N?;:Ix+kc*KkOm]U% 
\]3wp|-RU7uHVˮV]8P76~O&fQjmfd6oENd}Z䊧cԗww;֢֔E#fi}S?}ob9Ô14ov>=Ճv&?=Һ.VƷ̛Bu&HS3wɈF0r/gG\q3QRAk 5[; }֙XKN{2J]qqŹ>^fưyN!8[L^;lڷg.{##y&FB܆:9Wr-axy9<oM4BCVG+ EikEic"olQ .ؒ\T, (E+.Ԫ2iP%Ճ.a#Q H'\tkz0 *b5,"g,\-2[b*x-F9t0`VM ]}9kx$ha Qə%leёNBV Q(o0Z'-ZD8ş OLHGV(4hrD. 48w+*Ϻ-4k#3.dl[ba'];  +!M\tkzeWds:_ I?ޖY= 5I%<-Üʧhs!A$GL<5١ÒsQo\. $XGeWHقUPH mn8V62j PQmZl7eSiqOh`xNHV1uN,FzC_j#*z[I (AM)JNR^Re9+huF3v"'mITDB7tĹb$_bz`G="[3) ?XIQygB:JҕQ>VH WŐ3!32,:*$rYl@,j0t `m;R*S XC-""V c<ȌP#T)FF(bkmHe`H~ya &яjIk9lgohz(DqGUU]HˑGT2%SdȺMh,gL H21LD.r&-F͍D#i҂C,nK!RM=ecuvKc{m;ޑ2&i>_$1FXBmQp+'*%d/!Ev;cygy ^gs[o(&Q;ޏOh~M| 9rD4SgȲXeSs]bc8~B˒'a[Wy {GNeo6dYx"XS a$uP0ޑkGΏ 9ZXY-ƒkWP~b8~mn&@gt}I'y nf|ڌ/X}|Wi9?No>$rmK5j5`R"6!:CnOb86BaY5Ij`Mr*@WG 3BwX²' 78YRz&&. N8o<#rFEK_Qt)[sVnU\5N.tY4케_Ƿ7+_{3 v6tjx*WxC֑ر "֌K/K-%hPZ/It:n\vH-kj9QؽmAϋw^ovyP#not{@b̻\VsOs-8 wi[*ͷlVJ[,Ңc;}n'1D@5ت^/I?m&yר.sܘdkE"#Crχ2 #7 +`=g父,g!21`U%˿2<3=K^")E *:={%0Ut[t: JޅT"JSa2w}9;$c q6)7W34Uwv8IoUSO]0> ǫ]U#^u'B;mxu0:/ Uuj'@[M]c+&zl㴭*m;99ly4`@k*㑚܂EY(1-ʲKpkdFqҸY&uN!*sw]$+rvLjm09HAZ RU&Ȇ[eDPZ&mJ/PB}57}5l%0梪-XAVN=/G׳7jGeřTf_>VJ ^W﫯^_ϿPylXmS8^ҁ7-SO^^/=S2RnDZԣi^kziӶ'MX{ MҋN/[ n}ė{)ڙZY%ɃIՠd(Llk ~Orn0`q+)&{Jԃ t\TWƨK9?iA Mh2툻:r'\\ 9ph4 iUcGccR z^^_/6rQ度{ ;Ɍ10vstώ`d5j.5ʤOt=gz];sDL=BJٽ,O3)ej [5QISD 7htgИK=?:?ݰ,+=[:t>Jm#,$R7*&I =").|t!)XRgΈ =,p-zR(Fa9 dc9xց$G#p-:ljc8,]NR枚Ywes"&%C ޅ-NKMLkF:Q)":G 7Lolwl?Hp+m!Xɉ $=0D-daYagi/K?ZZK]d4*R;aETx[dOcZwvݮ~^Z'Yp1Z'||Vz\_W r]ga6YPuNE7ҳfvzN B8LmJfUpciÔlקwx;E͹4$I?O+cuhS0g$oȵczg9TD[5ƌc,K_䩽-4;f}Xaڮf$$wRrzM~[MQs,1 9`X.+\c{m2 Z\Y[ZrfifH=0PYbM-}[4jGI[:֡xK&SF7du}<ʨ?:4mbT}ܡ/=:_Ao̟Ɯ[Rm;1-JC&mߛ)`SJztMFB$K3 hW5?(,}t{:>CgZb .CrdsH2AbpmLbT 6lek\-!:H1l?NjLf^R+VQz{öA9 ߒ/zpRUm&r>{Tߴiz8z;_ kCm4l"wGӒKφ|Cf-U;iDz;踨7,(!A*QJ R,wooB)2n$kKZ eLzNR{OBu䒸8/( WJD l 9x]ƱT '@ F2 dcJ`0ē`%B8")%V^J W)J=p9[H&GGmߋ݆B%ûqLv:%a 2^IբO4zoxzy˴ ""%w !ۄxoplKzy7t:I~W0HzPYA+|7OHg>LU3M:   x $gcn9[AwSRLn8!ȲY$bޅ KmdpB6{1.j.M{#iG$ 7. 
T+,ܐA7QN :uUGg"}sĩr2y'-cZ9ϒ4l6hRȶZ 'p YR6&"R@X$,Ldy8n>_ջwwnvf0fޞˇmiiL| .:̣cO⦅<^?܂DbAq_@|"9W*\TXuaUXE]b#^*Z&d Z 6Ah`&!"eRЃn# m,*fmHLI:s [&#g:6uFΎq޽?&W:Ngzؑ_qg;x`w. lp@HGd-$Gveʖs_Ūv%-&V{B̓PMl qo_◗ RQBA`mq.:K_Eh)} C[3l8PX@QJd9 }FX";z'K9L:Q$BJoglۣfk(+c~O;;|- 1P\׋ݤ.dFd%1L6v Y&-pNT.?sҞ ȅgB lImB9LIzJ" Q$a@Jbeɺ1>,A!I7$;&!{IJ  bI 8duiL@p(|16.vi>yBǻi/ƶ{BP&+U_v-t>/(ΘEaJ@.fhOrNK5طBv!eH!{U;%dMXR>Ke0t*%(ј$d|\>^Y@:#&E8,}gs0_Le /f4߿ў r'vJ鲫41N\:ߧ@J`OK2%3 ˒$]4Lͯ~`I?Tts -l9iT8kʬ#Pf.E(z9S##J,UDXZ Ā:E:+F3q Ë?Xv1vvS ð&haQȢgf/<[UTYVdh0ۦ ZST}EM6E9$SF6{6 Z/gYv˴4Ϻk͒b4ӓbE7JW׾dɖT8 ^vﰚ;₿u7]:Z왜*? A(uҒ`1a1&8dHn&zWJx[hzc%oB&_ JBajُJ3,lbnj“bDulʌ$_^j٠ d Q/،CPlzQ,^Re;dW`DeomRJA )RkHTg/:FU`"m]-SHo'<mI|cn&~Ď+p1iDZ QԴEl)Q0V,8f%&J]['1H02AiWu!Eo$P{IW)e,eHŐHTgZm&~=7IqǢk$M6C l`dO<*ţ3X=L%<SxꙏWb>K g2G0?)SLnfپaQ?(bV|7nyv$^=JW Ρ^un@*x̆YϙƕrySŽ.Ֆsp?m~_j7m Z~,O \ўVܒ*캝Mȝkүaw['v5TǶ=u?ں[=?ΫN^2w-;n^?]5yEI^FP3oo}uaoc>D gtJ-enzG;\rwNK.h{ׯ +mҦ'[i[u"/_ uuF7Vt"bEhVN&ʜ^:aToJ5PNuTsyl<'+ԴPю44TҥISQُmCRZ1lr!*k5TQs СLt1O-q$y|tzWe5" J. I ـ:SJ" kY ;ӁK6{}6UCe_Jq:ō=iHre.+ч6bT.$o#; >PIQe& @o+\tzdNd`K>k1fDTʩ 5R4!X%zD~3qȊz6 Mdt< KK#a[=o|f/?;V"VVN?_>Q::$=6E UD쫊!DOyk] 'erNNd8Xn3:?vv! l,bbf*:1fԥ@ 9j/ .o{8S<,=2'Ρ6%Uy)knlg9z0^Q/-b }JlH5K0`ɕ FCXH*A,MZ폢+>_YYsEmvZ(D9E  RE;%cBقՌj_ ~99"k?XkeM, K#IL!d"6-ZGY`2G1nG1eӜ w=gLZxR*K CIdRMdJԑEO( $6dkk"²2s[&i39|c[8$%;٩H"!ۤ+A5H"YvUjc8#Q؇9xyܑJ}LCll{\'$k.ܤq9wۥ[CPz"QP C+IulrOjFt4W޽j]šLF?CbFWhzOuWZqR C\W^w4@lAz:v sb2Wu"5 S[Ҁ͒O>p60po^ *G&J0_re4 E}YΖTYKWZEѷs(% S%`l~=,>G{5 w@vSoftaRP(oFgw;[>Cx3C-Ƭ=0sHŠD2uw_1dP1nr%gS6/}jy \g+I²uKOH?XZocAPYuZڭY/]5>^HY)BKS~?]([ݱR+q7eU2}i>Z%^P[gs7mrpSk6tp$Օ8 hz+tmj[֛r&ud.k6];yNDD݊n7fݘz䚕3N/"AhOV`)P=%DˠQuhܺFlWY[V:D\EBLTE00h; Ppm3g"$E(.pmJC7_^t tvukp!_hF$\8d +누/1n.Ku]-Fu@ }۩ll.[:5j+G}1, Ϟ&{Mry?2]4K{o&uLzRcF6>llkR/%z/Fl]eeoPJ:erv7Lv+F)7n:_`55fZw5kj4Dc3gj57o3GV\I4i5'+B2N_ +)*SIR :E1CtEKDk)?/2]{ !Z hΗFL)+W%Lv7aգmRVbj?TFJ VLL ?Vӻ:K,ᙉfbBJ'0ɴV@i6%iB$ L}nbŔǽmqYmE   $BkJP<q? 
4/yG$OݧﺉʮEq*(j#41*弉m,pDrE i#cYNrU& F0 pbiZ1:4Q\ aJ k JBAAmX1(0D)uCv$rŐ* Vn 10fl(g\}yb,aԔT di`X9BGSg§z7ÕҠLSgHf 82ŭD ܼ z;XAIEUZtahs$R@sЅ,/ Yl͐5M0@&+d%7i#J3" ,]96y>^y4^'Kd [ f.v-[ȐxN\\~ŦӮEDIJ ϒkX#O.oW\%DNCUˢhtpFa*a.M U~pWgNb4Z?.}*qENISr3e2 fgK*Ҭ+9iSvF6B7i촧޸рCx_ =.#J8 _Q7)B>5p<gT9WW|QF]ԪM[ ĬKg[4 \aܦw Zx`צ2t.[hlX>6{ӆ..7ͷO;& )ꯉ#\0@u{Π_Mx./lʪd%jLUkNm)8c't+*;!hG:H$xraEҹ1#ox$IQ!Njy6˖ya)QWxڝދ <&`osQ2P#!Q@JVj`]:BrJF:E%tת4_xc\*ϥ2o\*qauUkLMC)E?4")#ISE/?Mm5&$jUSKJ Z`IhDchcܺhFlGQђ?V P!uJg.@` (86H3Ҏ"xkX6%/V/:.N Յ{]; 뺵MCkRZbC0ze%es2ԨWUtwb;vKg>*:R-Uw,U R =J[g-Bbxɲ^lf·ںj.~\؉z -/ʿ&ֺF6lF{;mĖ/ΊeӼVl>Zz̈́2a^!X @JĨ6xΨjg5 gsUܨʘYh;L%ʟwƩ9Xp?de0iR24=tR".& 0; i ghjCZ[9g<ŌL!){ipXri4$GڬE !Ie*hpԖ:ED)ıH1 +n'8F{6r2/qĈ2qg@CjJ=+1(U ࣐Q[8G Wb3ک"["Uw:<-zkX!- EOau(>a?)ˢݕ.?)ʋUqT`DOYNð_E~M1b~ hšpcEhIpH :b(%VXXd+#Ȉb f k{U?1 A؏heya[`fb`-!Y %Yv҆hce7Cm:}&Uwl׌vGڰ Z6V៍粉j-:!uwq%܀*ܸ=B::2RG"Ud2HLʴ{'-hQr Qa h h^픩XyymcV ӍEIDIs\2 ARr#EL<(%+gQ9g4e~ ;r^BLȆVjj^`pϵEc8#b>PmOLB6 !>IϮӐ I?3ΔnL7NL17Hh2 tkUJ^K6+u s%nSV4Mѐ^iW9R ]B8-~Q?IO}OM8NN>\>HԞoIKz}OYoXfe<5"x)[oz66[ǩOjr6! kҿ2ޖb~_uM{NX I gz°knA1RIFF"`qY8kymS.hW9>B}_١q֕4Tta@hjA '>sgg|5{'uL.Ǯå#х{<ʘ39C+d 5*%zr'UwŏquqvR2XQZ^= dkDBrq31Xém©m6'6?]wԶo ~{w7::gw>AEkqF:R:Тt*\efY+ԊOyF#M}ckPAbKDHN@o9!g`dTbjPóW[)V f[}S.L&?H o&゙//ͯӋ }mΠ4H|>o|8 &=(xPӚeX֥tTGTҸEA ՓOm<0.i;+rSF(J?4@$Vgt`hB 1*Grȹ Ż JUJ*.9-\ƦWК}Ӡž ŦoUWNz"]/7W7/x$xc.xFy$=?ϗhvB~vŎ'K҇I]T>څϏ5#)d*DA s"A)OQ[n,rSPUX69I@qDBf54lbJUѢFQȄJBl]S1{#92+·IJbKɮ$򶱁4s d10_T4P(j~16eDd. 
Lh,0kQ+xH HMN>I2.2$UJpl|ݽF,^W?Nfp ?/lz$N䲿\QM C?`_^.t̪ڣK/M9y3w7׳PkY=#IP[^5'b~Ǐt :y߾G׀EH+&OεDJYE՘(ءE E-e6L \" *ɚ̞ @,2]"8 _hA@%]s-94HZT0..hP VEPF鋰YIfy,fHir'OA|}vWmĬ*3hsT%Po+RA)9RO_'dɹ䜧TbfKz?S*"e=c& RZL2Vvc2FR&[٨(؂ gY߁..hVګ1{N?1rJniE)zِ;cNZHi 3o;Ssj$AGEQtT'U˘!JN /1(IĠ2Tt&rcdL2=>ϑ{=vH]RHo\jq>ۣ z2 .W<#Y^N.ڹ8Hyex_-8mcR |/ PayP2T,BL )STrG]%|=^}>N[z 8-8{ ?B#A n//,bY@xr ̻ylo2cW"U]Tz"Η$G!5" -BHKD+RNXZ9(IbDuP=UeAJ`il!FDkHyᓗʃԩR(RCf)Šc0ɺZ63+!m҆$ =Rkx&da A̢%$O[E*yN +$t0Qm V+sߍ*ˌHF{&SsAVmx:c Kq%f}Pja͓jaWja!kz5PϮs==Uy1D2{&ʻSˊ5ʡZ6&d !@dEDWt`$TneR(%Q#BZ*v=7PHD,bQJI͆3*|a/ԍ}}A}jDޮ+2pb-2/ѠW߹f?dM/Ʉ( βW`DEoҒ69K%9(6!E{MtJbQB* .E[-$ ũP1>,e$oj8=v{s(^kCc VE:6}&dJTPVD1VVXsF3dFEg]G l,GQBLLI5غ8pvÞSP=7b}l'`0.vZ:-;Q̕}􂝽q={@VEūgA<=ҞwV!ݢv$aݎğ_GKr*Ewiݣ5 #F>`9gLZqC{V[nawŀA%эSrTe=v n[6`5.c5ۻln32ܺݢwtj=dp~}ܲ㖷wƻ;^CUtŞw94|GKA; ޻-6})d뻽kA2 (w\#mGIHy' ^Nޜ&nenZhcҹ;Ìl*.̚)y(~,>T x+6j j6MRo]`TbD $$Ms1)H#U$ǰKptb8&t- ͉Jyv`?Ff2qe"U(11/|Z.(E` FHEۺAN#;ˡ[S)鎕 _ :)ěyFUIa)!ʙ ] %8CE=6Q9۪:ם͐utKpk<My>(?.]{q$` 6{6$fMj;~5 Ⰳ0Kx#k5W;XMuI? rv(ȣU\qT1>RMBWeDdh}Ԣ]I.u\Xύ$!\>6 L$xZĜS˔T'uĹ]BDA[ $M]cWr:~"XA99=2Q5F0&jI2N;V zLH͂OڼWlorڰW߂V [i*TC' ~-vc/*`38mF/erm3؛5@W"jZ ?srzF(ךY +61Qn.6÷ md<@ڛqŒ8Tu?.7gd[)NWIrX8ӆ^՟x4@/`CZY\~3St&)14fȒ.8 иslPAi# `N7)( be9G_L&ʌdXGhkԌ7:gEcg}͢Mik=1%%f [`m &{{w8^}_0>dGgVKs^!Qe R+VzCrV [@Uwc\޶tb$ÀZ ̠/"JEDD m*BqJ0PKHc@3C7U TNΤ Q娀7ˣTZI[Q. 
G1MsHӑpAojs{`"Y)8- 8Q AO`bP̐)пl?X/JDY/B%:*8d0tPE]{v}w-Bv0Yg|f*RڈYH9۔gOoU-Z]Z&0צ7~ խ3|caf-ޡZN)|mBUԬ:8L W6 N:]y ǵ~J[;i ]3@&r~sSqW0+(`x)&pCɧV]7 DuC?i:;΢j,/QK?8͂FaAZ@[w,ҌͷV]Cz~1lVS3NSrޅ+ vG~aO8 &x_ws^ 9r(K4&2*"E6DRAX) V$DZJyQpY6;y:ٲRvaA`x5He D[- dHE"2Jꍦ!r,{k/FgOo*wXżkίVMq۪m&(*[Y(–"*q} A632i~wƥ@Z]IBPHx)q76(lw u6;*x,D$uL&zI@Q3IE%x"!9![҉f>K3ߦ"|}xLգN C=kgp]wN[/$a U}6Dz<6ZFe6;Uݻb?>eJ7TWS\<;TQe>]Uu*ߠժT]r2eG/[]w^k]E'vXpLvv1Ѹ^CNF &%0DŽq(:Eb xCjKȧϠwVqjN6u0)ra.峵;?{Kt\\Dkcu4f5|]PԜsk>8ߖGSkʲ|1[)fBfNK@t^JS)M)Fc@|+4$&\ i;h$`OW -xe\5Bi"tD 10F{gtP"E%0q (# Zho,,xɵF(U2d O:bŸ-QO-?2񞳓êNqc^Wo3̫VOvɤ?Ѹ=W$U%\n?V j2q\zw$ok~=kW͒V.}1V_Xfy*{ӶGv/]} 1=kuEj{/[2❌7>gRy`k?B ޵q$;U{{k ղbItHɎgHJI=DDK gU]nBhx=-/Fg#U8ii2&{tbl4"n4{,ycYl" p!F?OONex(_ϏzEn',oj?i;/azz7M G 4yXܶVZ>1@TyެkD5ة@W f|?Ğ990_[ԭqRf19DFjNq 'K'aN4J1w3R$z}_O%^~k#孤da^j;?ʤؗ*V]fڛOnSDH.k"d.fo:iiCVA;[ x@ M(M>EǒS>)e "'Hcx*E&1֠ `L(@Er1Ƹ%-wjH e9A@XK wd8%TevrW/kpMnj'v΁0--t<]؂ 5jlsHcJc/Ò=\8PCH!0V&֪ U2CXDφ-^),9Km}n,SaYVeӹ&VY*bnH. bFV' tqQ e1.T%B(7%R L>lضA_FљE5ڌ lu9F󈁣{zk<)iHiᩅVle}V dق ll tՇJfA nN(ܳk? ZHͬ e2@AA0RӅ9{?@5 ~|nPQp1?Z|zS )?}s\MdžO&^џ<y(;|ZODWMm3L]O 8:8;yN10$C2iqyTĦP+Z+O_+|6|FS<ouoK1k7-}Mձ*Ol}}uEFiQNF.:1`Oj!A) >9 7{X"覻B[Hֿ.;tA3bySsb/9a}3.F]@ݸ8n\pa2Jхk* q:|ꮾih_js`^_j .ngKl+BJ q*<uW1DgIh84o\CV Tam^ULe#s:NJ6UD) ֧8ْ &4 3yv ZNN>TkAĀŦU7Ru*.`I̺dS^"4Ʈbqk@b6!rQչC;VoZeH(WCcȹ#h]Bj{be6D~uF^ǿ?; W] lCb>t@?#M˨=k,B'0Fb}Fos?-DFS^,@A*ĵd4ږoRa4=&j#U.2t6Ci] Z1dͲ:nPT0֗T-=6:8bPiو=* .A#*˛sI  )} PR2T, sW聆 UJEpY` XDD֘URQoQ5(.!ek}@%IA EAU]ViH|oB?b[wIGAv-+I׸g'ryC)҅l[ zߜIbjlIcXZG_GZFZ~M~\ }эD׳>9nx%]lH뉐H$"%eW>w7+l rMr[qLIstTn=PG=Œ=Tމ<_3P4NZ[*9y ZYe`3;QYU@\:5[.Jc5Wh5G2 Tϊb* Z[§)O;e~~zaN;>;ýnVw"Ps)1/gARrPl j*1 Jv\A:9*X !BleˑbI JYSZMYšEg}rbݳW_UӽF NDΧ|?r?ޏZ.7D.f}t#g:96r&w쌥g/{]%tomaыc: ߺjmX \v醹ѫu.Q+?.^X||z:Ky>﻽U[ +##ѧtIj+!zy:#x/V>#:jްv 7y>x$X4mZ8zP(\BF閫ly K/R'! 
SE+zpz&h&~W+|Ϛ~}nYsE ]fu/?k%ͺvNaYkۗH$rQW$!`FB9eu0X9=IzY P&/>lmJ$%%/bh4O:,ؾ+ mV(9磻4q~lg;v]G REY42uXDkr7#{g<Tn lDӬDgw.Z,`G঑j1dYX2w#!֐)x_cͺ \tFV2Sw*|cb'n -k!!֑)$֭֭.eDULi)[mBZCHg.dJ\~bͺIVgnu1(#:uǨbΗ <-ۺ&u;|"ZK 8׭[] ʈN1XO mjZֺBB>s%SIfAŠu;"̺&u;|""SLkyß9IUrT-Z:%LL|fge`:B]W^Z,E%\jcܦ_ǡ_*=N0 )Q8=tEcHQƮHBJL\V3 }rVd#R4i"G _0t k&58.q2pZU^/D"8{wCGSHE {;l$VMUM zE3G*d^[0`H~`+ׄ׎jZ!̾L[-OK,~Jeefo>la(C\0ܓ] tzfX¼yWb5 zg'~ ΩӼݡd`d[_qeZ(ؽzɊU2% T_pL+׏OάwDj z fau~g.,KIηv}l[gMOsFd|Pܬ sK76Kq*_ϬWDꫳ?sip99oR9&nRhImIQZrִJý&1V<Ƽ_m^ scfYs2vXfGE`CBcǷǻ]ŷ;nwTw4L5)q-ڸI&qy2f$Vf=Vq$ 7x5 &Mk6MN81*Ȭr|`I!|B߰J% Gey &0sZu,=%2Sv:4sI&M!է#tj7z![䘴 (6&BW!~zug+H/oμ.f;'~Nߵ3>Mb#,"11,P~/%&hH(m-j}Xc#85 ĸX:99N`\'THv)ۤ+tkT;|; BˇW)|;Ipʼ+->>&ǵ/'A"~3 QF8.ݍ6g)$Apଵ$`Pg7̀$t&=2j۴^Owȗ㜣mC*WZY/3:̺,E|HPD͙p=q3IM2g=͓iem}l)FXi,Ak4*"qY>D2/} L%1,:FfO>SВ|lCf ԫn)= 6x$qPW{xxG'ɲbǔwY\eneD9(fQ^EoAN/?/˯d +tݹ>zTޤKם맹s]A_i%ԑTG"|zq[%6e „_oe<[%k:5NZo~JQv1ca50,N@5oKPtWĉ68ajO8V'r@h, DJX"b >P$M}hhFYCɻft f1 dj3^n<6 ,)PA,,RY۱^s[0_S@`ߢzI("a5bK`3 #,E[X2΃\"l0ۏ \z+e';SuqcNjO{eֲe+(*gdI7Q΁k|ΛI ?{1SC`JL1CZ2Q^11 ŽdM+Ty5r N%+4GkC βρV8 柚U~Mw:5z8C,,!YhsI Jl6'0Jf!'fuI@u %ƷH އpNf)Km'{ϟT_-2q3y1Ҟӱ^`WoKz~kAKkaE'1 S}I Svv漢.Rլ$R҂B fޮGf:Qgut7X@UBl8"7peą-t/yRׯofiǥ?/eFyɳճ~IK5ǁ{Ӿg>t;n# r1£^i7MJFoBk^͎_h3^eŏ4wvP|jgҎ~Y9*~6Ӗ_|MO곝 j_kJ7`} dfN(-AVL*7;^+Bzjv|Q{bwJv/ȧЛοm#"}fqu`P$5C.=^$%6%jī(y"ðg(sFP{aӤa@ Exqoat*_ )]#µD C(FnIvXSH_[Hy))^,kTǬ#wn^ofH ;4zlaJM )./. 
y}h=\n,q`Ox 6j\W`}&Ts!'klg%y./'eMC!]^+h\Veuuv,Z]O'l`$!"f/nnڞ*\ s|-z^Vu]ĦxS`ê_INILu -*<6nc꧒9N( j۝0`odԛehy"BŹ+Zm(:H DUIU{q-/lgI~3r%x\l9">.YڒtIXff9WP!󾮤te~1b -f:HMtY ~YE ydc*}9T>F@UO.XI¯WA\`?EBOqh,Sx+; q^W30ۗvX}(D8kS8p-;^?k"kLd?B_סos&-!OY;(+サg:k8] m$7N]g3,pn ^rҘ{9s^ H[S7\IJy\xz_;994'ű]hd,ֵJfc6Mz¤Ud7&[Li.kI:BS9N eP?9;zW [ WYG;:s mv" ݾcBKhAE+чTBԚHLA/.B_~w\.)vn9G9W/WcNa]X9VY[!gU2چ{gcjo\oXK g$K}2t{ R߈LG2~d:wN Yu/׻ulQQH\g jkQ)N ZZs¬SI칲FL*#xjYPњ8=׵CҎ\ JE kEAQyNHbQ`S^ vYUJOB tHZE$(FHHi tdAp*xfB <B#퇣h3F-bjgOZof0PJZ,AdͶ_8˹`10oᥜYxe-ow!T|̂4݅n1]1m 3VޗPeJ`|L]k5 W#1e5tn_p1?N3\u!fHKUݹRCbAc 'Td=6?fJ)ӳOS/X- 6j <9[pȘۙH+>fH <˨E `&$K# 6 `e;h5^!v_ =r f ǣ(x [w_RA^]7Ne㠍Uw)rƓ& 0Q~}hB93[8#O#aX\#)5bբ~05+"u?WxvWͬ;3\$Z-`~41nVQ^-)٫©7BY15cl P":UwZ y :V- ~P)2AѨZ'J_,Z6j 4BIHv81_̾<F8ڞ'~]S9QkKN(@U}$.d )w!%Bs1`?ב#o3ddqn:BnbW4}{e-.C^y90MI*Gի8Cq}XPzG\;6a-zk\(R¬Gs09Nmxdq _~MfF`^|Ú.YQ_H3Oy* >Ϻujf8!©1`p"gDy!:@nFN۳-Y na(X6PzƠ{ A ɑV N^Jϵ@G!&{@gws#_A ~ѹ (BS6@Rm$g57iNvw@jW ~ !m~1?ݟ;4 `sT;}'㥣26a*= DB[xlȆ( G&J*vEj5F?/.)D!ӸrNyKhFʽA{Ki~mAZ j+&i0Fv \ˠa[?y̭7@15yjsÕ{2yE|[< \ADxzy82'2j0tEYf2b/;@E_O}2NT/&_+P&UpKGnɸLD HR4%(f%좺K5P#(jv |%xh@S]Jww/fTaG;*w jʻK5s %Hc%)04$:d=kK+i%%מcz[}"d8WO5^rh Զqw6Z>/rQY!yǬ )c%!˘fD]b͙™L.a0j! /&G`8Ox3nVBLOۇKEra `'?[.qƵǾʢ۫Of *Oy1vk[sݫl?D'_[ $}Y|HDB23æuҟQa1)C=$ mZM wDZ6(YAi#ihٳ ࠋQQ0WX hzyZ+-vxg%nj7^d(k@m67!5pT"!骓`J @ݖ]E YnqJeśB4gP4րKBP)Ζ5STc`6hc%,hsbx=ƫҐW8ލ[`R' Or "Y=@LLhdZ̚ hPjO{!aNv0hqj!'N6g)z_Fv;=;N Jc=`'ϜA ?9X٠p8^d4F?7#M)eO&bJ.D_in=| p}r;2R ڨ`(@"?ցڬm%$PEk@nĚ`;BX!`tkAvOgNയ N F(׀eBGᝈrPRiԅ um,Ir/} %(zp]^|#ͩ(J`L/~9I' qx1+B~ iMivmAd"VO?|i2F*s0R~4ܱ]FYn%pEv94@fѴ9 D3~WQ1K19&s`r,qG"USs:kɤڌ4 m5Dhܓ[:hs9R^JL1NOBegka %H=rYUEHh qKe6|u$L^ӝ`!á*dC5ՄCّ(]9Lod ݅T\#P([=GQ#G~`NflmOj`{<@:v4Iz8W}+ Pf*pQ!1/7h6 n\CՏ7۝!4qP|r͑QPDU1!KgG>!\@͞M'/ _bʠĪjGZlz|1(D!֑,?9V3JX9\pIux ̠NIC!}F 6/n6sN%M e ߽X!@J&?~HnM`**ˆ}$ j yw%^4wK8oE'^9um Om{OO^T#W=#s 9_ņ_Wܚ!&G&tfNh$RFT jJp`T0@GH0 (#pF?{WƍʔP}JZVj+_\# _`0PFVhFi%Oj0O4E;4dlMx`7`Zw.=h&E1dPQt{|4Py_F {2|? 
8pX:{ n~qgJ`/Lu&l3CnƶUq qC7n#e3`kf>ջ|L%,<4;q;o\k+ !Y6,m3AfsÂd#CBaZuBƿvv/^1PP-+-/ه>40G1k.^o@/'m՛,T༏Skf87*bT{tdo>M?< Zs_f7@;ŒRȫ^X`QH/5tC'K|q;{dA?Bjž@ڝpL YTp&bF߆m@^u]_646ڃͲcwS}W)ڜ͗EB' ww>\P:`K= ¶4xH2_b4Ph8(FŰ]{s1lWι :vOW?zVT3ɀqUyڈ QZ e*V2 abAs (IV j}WY"E*49YRr2 r 8Gu,Ij*sUgKtg~0s~^f֬[[b Ap: Ё xE·,lg2$ Bh(M<\ IR8"QqHiJ%˩ʴ{A1YҞPb8Ogu9 ߇] r& A֭7}Ƿ 0}D.McuI秐Al]].#C51^fE_CiIgrF?geR`k9>GI=zppy`0OzdǺ6}陸℃Jhi:޺%Kc'ctg%g+T~qxd23+9xĥ)OhW?: ~[޳mx23嵠Ҽ..cU+OT鯙]N?ORO,=SS>KJXdӷ\~u8lm}-o|o0FKikw*muEA?*aG2qS%AH:TRWt;rXX#V)աR{vP2S:+n[a vƱK#u7V 6T;WtӮԹJe4zxYӝz2z('N'f23*{'~Bܲ>L]#/( z= )->ESBu +pËt2T_=! a/, ±/%,?Mc5\v'"*h; @_ذQ"`hץC TKRRu>(Uo#Tsjʱ)J \T`-gL(I19*Pb3'T9-{n56sB~:xO8jope2o{Kg {?OBT14̸L$yC-ݘPIQBO%*q)_ ^wƿ.o|ڄB>ǺN\mRMua &nXKA.F l:6O?3"yLTG; Xa:Ǔ;?^Fk 1J CTJV-([!:5N^4 SORǴi*O}pŵ\$ ?뿗߄QڱoDlüp` D:7hv, p:zڢn+;FUG_!,C؆|&aS:&nj:ӄn7-^jIbq!=NKy̻U<żZ`^P*w@J0e;I'Nc " 2mA<"~y8?`SЋo|S|T[.hFi u$0Sf)j ޲XLt243GB@1z |Ho߿.CÛ,} qhmK/Viaz鋗%YwGZPR[^jaS1p!{r쫭LInUS}sG: =saoCbP֜tY8'GӉ&@H DkQ)q\Yx;I#Z $$A&$#sy4?`L h_fM-N$G?ĝr nwhA7 )pם][HD:oIۦeI:zFy'HnS𝳷ߢ?fȇ/8MvvA@vJ!Z\)b ܠ!!WF_Ӑr opbp(W@0z-6SxabZTaF4`۽#Z2Bsxm4%aw:~@rĀ$pKsv H;Ks.j-V;5nDܡ[yE4vܧ ]:*n?sG>sUǻ7qZ6&B'%sK 2w Bء3=+ɧ>oyU ?ꭋk޺|33sngzQ~qAC)%rZľg%j* ,%_n1v< Kr dBCБl^TE[^r aIh9܎>7@Ts.[qS\~"II*zOg'`D* b~LuMgW䲑<zlDB聐^XiHA ($)*H┤Xd 8[A<0p+1Cf8M-u`\~Qe) 㩄z21I&ԟ2~(<2XN@k̉pX0^9$RzUEJ'b3I-|I3{X;XjRjNh^>!mSA9UapXE_m͞{|^̔A(qM\GKQ,sm--Ŕ}ӛW_`C.W'͏~ _'z:/Vr4һS8(Da;ӕYN&yC`mFCH:~h|w9һ &Otw> wqMqDso$0\F:Zl0< -u{?+vh;c g:XD`oX}yLL EyV_{B`gr׺&w*LOTdwkfϓ3@ɬwBrwYRj}$P*Wя戏FoG˛v'Ec~P~O73t-A 4| Wor[cK#t&:(Q{q+6nt}gvv8d/{IV4+a1:^"hb~<ќA.3sVHSBz h$y=A(v *Z`V@`Ǵrt[v泷Fu (=6i(Ž,IDG&$[)M~`7&08^tW^3AmVSmԷpl4e^!͸5Fk|OF{4'<%Q*W7ATSySm^}yKgf4b+iگRfԴۡwPH0̑͡6'QskǔHNEN:JLԁ%ce6) UfPsaKj:c0",69EV 3.LzĵDkФX{ Q"ةj+QJ kIR{Z'쩍: K(w٘ijeb5uG 1hnbk9֘[&ML@2MY\܃]{˫:Ԉ~NhuO1[j9ήכ29e6 ߼[=H3l!sw\ݽ&`z~v}mkB{O;OmImƌ4Y{ s*wn]oiEO[wT?E:\§lMؤh[7cZ۶_NKKfc;;n2M~;cme+Mdde2MmQyshx)10QSjRzr/AE*iccW{9Xx:=\wKkv6C #7BB}v)ɯlӭC/!J`DsCu@kQxu6Z A Z=(em"Lg굵1jL'DD'uوFۂhJ CYdvcxlNZ1hp/-d9ryoBGGb`v%eEc!,3>Cᾡ$"N],H mǥ 
fցA`K1IU{u21. 6ۧ[Gq7S]r""<W QUb1[UNh3ACQ07~?tMS 6qL:euNőCאS<"fV|LgÔk ҩfO[18=MH: ^Fi (_u) ¤mGX P>r|'5m:͇:NT]o  ׳1oks>ϐ0b@H?LtgLU 8{ѳqjݿ,EX+$ ђ%"`Bq lYqZ`Z^9$ Co ltӠA<Z$Y`'ԨP28UIMO:jC{Qm=` ̯[p*ۍJ:L5vj> uߍDzAl"VċdG D~Zݑ#kuGUz<2tBrDrEJDL1E4f< S(Vw%>_7|`6zO#lk)VoSCsƏbV0aIA;l;O{8t-#4(B Ra.1Tf92ϰ֜<k,J@QKviU^sa99* c*#FY[gJ F"5s2(5tW!OI ЫǙf8(Zq#|>\1]#MN٬W仌AM-%VZɜZTq3 #sƍF@&(ekjnw1G0֬җe$+o9xU b׺MfA7X]UOէbڒ%-%z0֊8p \5}Ûw_ⲿ#!֟ߠχ+G?g5jhgXq/S.V3IW~~şS྾Yl=Kr J}w>q%#* `8qz0H 7$(Bt=)i+)" ?<. /Dmϒɸꓚ芵X { V'zeN>n27@Tox@YbJ3ם>00 AB>%yzfsA^ !4h 846rjix"ò ҠI:N헨vֲ v L'Dtk$vd1mpI4ӅƵ2I2#JEIMO睇Y=thnagFnF[pp!`d~q <#9"ė%d|s_; aw'[?~dmQW93!l T0)2fPyZIƴ€"4P9߷!~=F1:| c񴶌2kȮ-R4:C!B2H$%9)YnZZ߆Ƈo[:;xBQhr a_-nd 7գdWr 6_<UXJ{\ ~0B2ͤ֠ +UYPf@RI!EbQ,*~quKA %sx9)twx]7D|U58؝y:;_ @`&bxiW4CtjR7~z[b C>*GSyA>WBMĤ{b +ef5!0ڳыu:8 <7 =[A%@4Q-7No71x;y0.?~-翤GrRJ-lOf/|20f蘅Xukt0KpQ18S'Zrjfg2-]k6]ܚ_ .5U_:єd'N 2+`0Ѽy)&f+HcZfƚ B F*bcqouDL=(T -^OQn%ob(K?ϼ,K80u^@ ƭw!sQ<0A[Ed jE Jtޠ,^mPa ʲ6("JTo\%Y7(!Z 핱[n@(ۄ[T[H C˧[p7֛!*Sq3^/$UjC5 IP<,]af*A0P2@r_;;z?=QT%0(1Z.X+cwrB! 899z7;BRM=I-҉€28Bz>iXtγU&zt8ük |Wo߿ӋA:?LUW9pipC@v;< EJ!'N=ڜgnSmd:$Y(ut®QXv'<ɼm]"qQ@n8{A4g8Xx$ξzX g_BܟMH$098GoW9;h@(nre9* c*#@XS@γ 8#D 3RI`Qm'!B[gXWϧak0}z*Y|T#(DRF漧d8u&z@@MJH7%p@\ _qL Py^Zgf׻NmVՃ|8?!]^_^M٫Uӯ,NiVm{/$ogKbNS1 e8krq΁Vf>!HHIh)Ji 2SHmtrL4U`I2E,R@\jZypVJІZ;oˡ4Liz_QHh aJs0KV,QV=>&BOէbWw<z&{Rzz?.P~xKTvQts|ŕQY2݃y TXWA1 dOaR{ 1I:_|qJ?=Ny}}D8ۃo 93[7|Rw=PđX %9z8 aRqDdz;۫m#a72T^֊$f-Y" &8Ԙr l'ݳ(AM!Z59$iHٓc!)FD猧D EHT 22Cq\~a܀jZCTCVCCT30¹-!C+oUu!B3@Nji>íq9{|e%5m<Ň\Q=CaL{Izݘ; OT}KJ'B#7P  )Ν͑Z<8=Q3f}C N8eiQdžs̆|uw#N3D-4^p6!c'm+Ⱋ6ch['huKs0*!hZJGSgR]И@YV$tD/$@*B2_?? I±,G,xdxXKhRl #$/_Xp EB1O2Hh/"hYqKC{=3+D?g }'0kGRɹ0ל7N?oj`BpwH <ɜ0c4v*ZlHL 쾘u( V龢5KED?=/\DKq)Lg@f.Rh8)QnU,P< S(ꝔF W2)k+1/+Q !z]D᪳Fw]nI/&~j YP5RB2mIɔ}nZLf˄Q.JfDsƉo ws7 nx82l4ᘪƝO  %U@wi:GYxce`*E?>Ȑ$C. 
H2)C_0E #=1HEƀ*DD&**7!hP&Mu9v>y7ϖAY~=zɯh C FnBXe} ٛ7Cg>ChyTi+GĥłClBc4bNJaL+Eo02Y5_ luÜ"ve`ڟ]%8 m%hLSM*mw)Ҩg)[Ly ?-4M?OP0{~a/F=6ZguWL0"#rbU 9m{FʕG19 c :`籍؅t5AR "?楴7d5Iqwo4؋{xDXE0 1FH1WYI`wsD(j,,ٌh'QYjex=aIJ -Z@A>;{S=qwa7#Q#7C.iBVQwj[2mqDk/rzu&m*` ]Ynu '1AU?H]xח2'f1KayIBe,:uu?ˆ)m/qy_"sNyϔ#JUItNc4UkNmx"9݃~kĺ)pUІ CyL$se cNLAkJ#. ưiGW{a+@ l֚zOWWwXڊ-*m|D۱>x_F>VPPW_2aӵ}-[_=@0_Y^rbJK*1W<^~۫ x5dTm?0ǻjUbeg7% 70]]Nyq$WlYȾGS^m5Y<_o&e!=^o%,{f^ A8 |n0и,f1ԷWɦUddQ|s~8m}\w7:\h@2+N,l]1V30Tt V}t3 2Jy rrimQXet(zjП|C Lfw7_NDdofcתbȢ y cIT^ǰ ƂB. E *<θ $9 3iJjAk^eqgC՜'! qBeVuXI/ c.ba[F6]OcjAL D!/(G88r; ̃o>JF)pV 2IC 3W:K4LYt3&:"fE`D Fh`R F*S;#[䨣-:'mhhSZ\WF\Fw]=TV{f)#HM0î`HFKiARXHz͢B?Q*CR9`VXvO)Iev5`W'z]l!>juRN9a*뢘k) վB6s0ml甹s|8?CckC=Lnx9QE|o{x$rP;1 cu4Y8;s}G:$zJ?u~I779|0% 7f a c͙dYX+hlfd(ybЁޞ8!]=|wٻ ?Z|N=7m_'C''q{J gR3-/pS'{c,;i=}>nK) !"{dBo5$7ٗR#?a*V Z}ůczOSsuOE%e}Յgqw5#*2lqa/ f#B &>C ѕsSÐf>Nj,s[]Y`]Ysl?9nx)̊Ifm;F ? =o6=mD;ӉK!ײ7;شȂ=ob FǸ|d#ڜ9R̯Ԭm2Sƨ!brukkzDb9e ޹6>{ B͵w!Y{Ocr\Y0ABC:Օ}}hެǸdUvK0lńݳ8٢luRh#IUQff:'KX삜>Fիgqw5ߨF91z4Ӄ 8 J=|3:Ӄ8hW)YgKGU@[+&X!GeWOjRԼm翆J s]Yz\~~~2Lt"rM|˛CA.r9L)IP7\̯ MȦwqDVA锖ƻq?YĜ ݊ݺDlГǶo\}DVA锖ƻq 6:݊ݺDlJ83-:֚F>1!b2JℵBqa(C Q KTF.r4ᵦHj/zm R4h|{`qD q2̑E(w tD NR}yUk.A)9޶6xn8EAyh B~p-F߱[M#1) OKns46 ?)lhtB~p-)!G8ѣnNim1g[qCs[hM g;8LFVց֎ Eґb]dpp 5„(A@0GA\4S^2 R驥`?{Oֱ_!H+;U{F}+1Ćr=E"\.Z=ۜmΜ"Ƙ"s< .25vQ@-%d? 
ミYS3ymM7s[W.%2UpdEUVu@VJ)}Fv@@M#ڐW.E2%\n~1ڭ9S&Qw+F4W!!\Ddm-=ԚZc Gr9_&:(iYy;L܆r-)efGP3$a-볈*!Q$JM% L*(mLY$b2K".P5 mkI"CBD-(Zc44+VzFRP@ qK(.F+W{Z~λA0I;7t2%L.ϻ_tзGY`N.A} ZA^I6Y E>cO=+P+\K +AM-}FN*w㭼O׆r-)ށZnET) ѨpN16kf !\DdJݤG+AV>Rv a-F4;eׂrZ8B D8rvqb!9i\9iB]產Ȯ]^}6ԎީX1XpsMu:7]sRz !&jAO  !㥏DĠK ȚdVF+]iڈ>/0m=!cL3ӭr8dw g,2 K9#(:0 .W0s;y-o\rӆDyxjok7{:S vGy*z鎿ulAa~biT*9%i['@^ 3 pG)w]pF|| ]9»Al;3mb=;7sr{} O "ܹ: y_cv4w1{;* D._S2&BnP¶r폯+&e VEj o{W\$4]@ "1HLǺ(wqppn8r W-"9 IU[yλO_>CaմPX.dI%|FˆjFe,lŒ=%'m\> G_XÜp8|3 ,CJNS JXÖdVؒSm[r5l-zQWw۴)֩znq6E+VYRo?*K49ղ=+ڣծ=aH [ڰ'W`Jt4W+,!~W+,@xaӠpy0=QK6W|9XiMū`wOk7pŋq[@pd& ~mB¨d;J<;}|Q,4ϓf Sc ˪ Ϙ AI~[>w+$Ǭdi؃t3hYy03Qkz`)y?M3qMym Nwč7MkbZ\88/(KyND˔@M-sBpD0i-'r>YaSˉ& FU6r2xzr9bQ0bQCB~/HpQeoie G# jee L:"iמhsL Շ1wȷ 4UO͂W2*+dAĠf?sk?[(Q $PT̘#Ocum I5FWHu}lש}4VtOoT>W$Sǒ{l9 ńr<~EZp.G!V1HgyWJ5Pg*L^Bd<ȋ_+{;AF.Ľ%,q/e{,|7TG)I GH{E(u<E=Q *`h*Ju S:Ri5ul}˼2,wo-̄iY~6Ո}7ݐ4 }]{TS3K]ԄC;#5BQHࠈ.'X}E@X(Ugn'b#e; /ϟqVBaN)+ ZVp'1El#k)<()e&<\ZfZ={5rI@hXi™G&׶Eڶ<`vh,Fmx a&soY,Z?c`DH׫?Z9 kIQWfF,JiO>+]x MxBW >sY{qF$X>IȤ5<2&UF5ً,d/e!{) KY^= Y>Z`&R as$R=NE/i1`R5٥*.},Մ:2U'Di苨]5+wk1#`Ԟ;>dDŜD֯|E5biXD0v,)u6 BJɱEx@ Cq&ih @g__JҸsDr%Ra+N`壵^(aW nyG)u?u^ ۑ!M0ee)1*_E4TYpEYH gcA%Z٠4c2NqR=2Jt6 X `cƜsvG΀qF (S̗!#g![ڍ~ٮ$mtJR nJY$FKω+ԧ. #j`(ivK1{/ }C?霜Y;vo/M珋J$J7 //pExĻQk*`W.ȥPtG?/NoΐODZ뾓sĈ۝!?+pd'G'Rr =Ts3/װTBkr*QWTpgt.^sA5k2U5ҙb }#*@-I +#?#'Fnn68|^g4h)t$+ ex$`)HL2+[z{8x{DI sOiZ@XBFj7uWs$':Oq`IcE8X/KيP.QWNZlExIP./GIXr5%"%EQajI&op3(^@tڶkDC09TXe #s]pR}ʂV ^GIH@& )Ķ$FVSsYp& "@IG76LNC4T`KWڿ2Z<}k%)*-W7AԦ({z"?x Z&Ы?d8^>9h8%RBB[&#|Tđ豷4cHdpDbW6,p0N2%K+$+D!6#odTz^;qZ7=9 .eT`4~{40Lλ Qkq;N-0 T \7G]t8Bd d Rs,GQ͍T8rn֓5" JI4xbaK OSL)ښLqVzXrXܐN+dtX܌XsTeSb519e6bjvveZyې[$xEOr\m1&w]jg&O7=ٮ+3<j3s֨c\KcH͓7CG,!ӯ1<={UL3o+A#5@*T󩐕F9Ⱦj`?dIg3 ڝr_˃{ޕƑ#"e2`a`{06me#`UY*UIʬC^ R">|ne-T00%'}ww˿ճiy{&7~vbu: ɛkRݙ\Zb~Qu=7]8?0umgOY?zy}y^/_bѤn_/߽"nhʾ$HWƿ﮻x{{97"֒n޾_VY8߯STsW_?RcvS?Y*wJ<2_޿|` 1:=[Sz l6g{ w/1b5cf?lG<''[QJ+)ÚgJWĂ/Mo?]&5pi1r_'"o?p @@xj+)[:c֛l춼sSh+Z^%־UbGZ/م7x~#bIr{ƫCAvV/W{;[{nΠS`+~Bπ?] 
nzd XyGw~ 7hJ:u1?y41DQD%,r2M"./gԲe5:U.HYk5/j(/d#/d-g@cGPU5ܱrԄSIDQ* }P'9y*CTuP`X[SES* Vr?RdoCҍW9}Q R$ K`+!GGXL1&c2:TT{nZW#MU}M/)B)R$cuI +U4`|}ՈcDk9FO7f`i #6u -P&!"(0},#zzF( Fa<N ɀ=X]#lmW r0ԻAl}xzئYM޾[1I28 3:=.!eAd| gredV-Yk{nap./Kr枌ݘɑ'lNۧBLt(>C&tbdFqN' 3f`A vc9Ǣ}\K2kac4n2A;y$3n*1lA^Ld60Z)^ЍL/;ն jtb& G$4Jʆ;؁ZW*n)pѨR~~PH6iv^;lCaH蜱NmyǓ)1Kn*H! h79msJ ov *=s?؅YH}Og&3-5Fd7gSb_jn"-˟)ɔ3mc]GWe !Yqmc#| {0)Zot0;mQ|]Q #l90"fa ;{E`Q""Z?Ml );$xڍꔱ(D2֣$RJ%`R`=9īDLTR3^ݸ7oW*| ;`{=/G8j6/7 @RE&ԢJ|QqQziS, 0s* 0gFϥF cc 9sloHl(ʯ4"j4WY P$6(/BVM h1&&Kj$.vQFSs]4̻ElV'j@!Ǡ K aZ#,#&ڹӸ2p+wMXkԇKA}p>ox5b-dd:@L+D2d'3ڗh\Z^KFh=Ѝ#⾋ f7'{~ӆH W31nLĸN"Y %I&tNH.j T Ek*'>\s3wnTkQr*"PR$e d(:9L*8@UdH+pm7j(yZmʱp43IdmхSn&hw$z; ڋ mqpkIqԀVM}2$AS@s3a I5X=a $l ڣg7br($9mv<}m[{IJJ065U.s s@bT[Z<2mlLuyJXz$L:&Ԑm),Tt%"SYar(YgDQ TS-kQt1sy7] olw+1:~UdY(~bi-'F@>T^uGXm;q؛>^(&݆3FCi-ploU84~]~SN#>? G(%pDm8׉7{\_ v`+{yʵgt_'O#>-&K(z8:`!3_Oލ S{V1sFj2:bfo~juK>ZuV(%. Y,[ԏt˻z;arǭK}(&]ǁ.ug\U/Y-ȭm ϵw6qQ )u<ݦ˿XkN'cg*S2@Fn;6ro_[ `8PBӋj@yуc`e=^h%}g(a,QԞYrƯ̦[4֝ VNkxVy5`hy,<RpU[#DFg&D?XXUŵ(XLqtFjg5xH'@c BX$ SAy_JR^=1̓W%}MXHE'-dT Ә1 gX0LUG`ڢMuzJ]9pAwztGrk?ާQm)ާ)G)cKr6YQF69^%RlDl}#*ZUQ[ )lG'B wEF-[Npht B#/7,%g:EL3m =;![p7lJ栉6 ?8fɆCv9qD'҈|ɤ&ĝ]Kz}}s6m${%y/ @ Hd(8DC0' wX-r;ؼMr/صH:МWWAG<lu?NTSGKφ`/F8Bc l[Y;AЬ R!:йPGB"HhDm%J4j8I|h&"TEjK8B־z)D֊#Zwq0J4L* 8B[',Ҫ2 rRi bh}Uru!溪[F9F<^"K`eZ,pK%D‚j[di+ mL&+[D/(vޯFv$:@(#_RZ}=@_zmɟ a"w5dݮ綫ϟ~j7j`)h6 [ӹL"*S2z|ZY`j#LNsobN,Pjbޓ- `v+.xgRce.5FsvYˆ$-l'qv_z:l՝-hjؽlNjd<ފ\pjwÖbzf&W,?@e3Z Mtao4{A{h/)@*M\aT $eT貰>隗s8 bQڔ}5ؓݛ;-\ջod:c߰{uV^fySw\Iz9gĸs/4Tڙ k; ,S] )l!LJ )<ފG""emid["M/}ؑ;T';D<=֎T<ߎdnvxOlpG'ZnVPפ:Ik7rKlC t{k.E& ;)ϟj\.ߎpPZ a;miHiU}!攩!%`]\眢ؐUZc흆<+#tEWj/>FcՐHCGo/H҅#ƚv{$=kh~M, x-S{.yjݝ6ZkF?)&Ǟm\TM qGŅ0~T\YLzlwD0_t}lfcY>6B5tsHKUs-c^ܞ4&%˧ Q0GJ` %VIqW<\P,'5nFc c%( Q6/$zI!2W2KJf/Q~[黁}}!MS$$7S8/fzI\n\z?yTg÷qsb۞F ۟k#}$E5JnDΝIB$91%]x;*' I^RC6Hb2r<_Gp;z&G[C+uz]B s sRk]+}-vp-9[&y)[O%Un!Sr^REFerR>5ZK kPh*W*DzIuμh\mn|W泑_Gt`4u? 
ᣟv `0 T/:m^ک jG Hk2jpNio`YTtM@ ⟇!vk@8"LZPGQ:]ߡ h tJH$F2-7ad HJ>`A oί8 #*)wB"n ,Bm(ǒ[)2' E1P4tk "&Qa!(VJ`<24ԱD@ 5RNQBII4<8P(jDRRqD^ bIA[+.iP Yp1hBfZr%hB+bzdNǃf CTBAii3 3s,Jk')|%C$g !kZp3cRS0T9nAOH8l D'ajg%5AOoXOffvξ ~Wo!Wv~=2qG~v/? `&'_Mo=xLt棅3hif6(5Q}3`oh澎3WڠU`w-q>'~a~Z(س /g띳sm=&ޱLo7*g\޻Ɓ-WÅ2pOsGt<կX8bYVg":163iYFqb%eYY&Oae lY}̚nk`Jv۔v$>_{.kx\_kxUysLPcBOFypZa 6>hf]MTɌkg}K,7R)7ƴs//{~bb//__,(X8_b=.u~bbŚ\_Ų=A~$bbIŒ%GKAzg;JD(m-qi*;J`~maSF?ADV-!Pˤ(Vd4߇Ñǯڬwg-0bu_jP>j VXiu Q&V,kmpS}Es*8G7??׭VDF&נ^UJ;ڿ~@u*lU"u:(DzLK*A s?H@y N0Cf AenP%5]/89I uzH/rMF_ƋPkR5H}?5IFdzz!H!ډv? $&ϻzAkϦb 1 ma5Quy<#~@r4,D<{K sU_fq_qmNhՎϥsI@ drnѮ?PBqa v U9*lHU>35u[a oka;Ϧ6`=&!bEQ{ #p02eov3 DZ37('S?Jg.Pi%uoċs=;?D BRH d"> Q)*(ID05 1c||-Խ]R0WY:fU‚W:kgY5E9VyHc:ͪkQ~<5tp%0dZr-_F}]M}b-|p7]bjquwOw-Ⱥe [;W>͕ 9ǾNW7|$|յclfֿru\u)+PLu]1GnL dB`*GpȅhO aMA vkꛧ)Ӎib[ p )A;n`rM[na1hb=T6n18Y48|LP]na1h⬞9-IAC.ECxW]`PI":w ĠPP#pRq1[pˢqr,S u>w4]na1h򶞦ixS-\8TUY A vk|iƣ[x-\85OꙡC&v Š넎QFw h*qsqƢ[ p )J:Piщi(cw^hhsqEn8Y4FUCZna1hϕKr%n jr,SLtE'?spxJu~V tTITn Ze$bMvf s8[9* L.̚v7: `Ο棥d򋯹(qތ> 732:\U`Rp0$+ԻZ!0`r8svg@I9E|RTILZ19eβD|́RX&5VU>HZqo(*f4g\8b{CE)U6S Snb0>-9OMϴ#!%@ٻ6ndWXz }Nm& WE*$%?!i o&De\ 9@FhX=TuTb`Sy,ㆺT\a!SBXH EC8E5R\8ЂDM҂:`lՔPk]HDd^ `sI5WJNBJ%,,}"v3؆zg F3-"\G\ua~(#wC3 #b61zE)M/܁ŷ{nbf1 o-y='M-yfY Ӳoy"3(-1"{÷g8LgIp8]>OqQ![}ghe/am~ݼ+8~}H=ֿos6QP2GUg3m!\#Op* fyc֧5?]8F%,k!ixP4ľޏa2 MpҫWa+X w0JQUQdaIzj85Jp6/ 9U^ 娡,S-(kDvig5ȣ>T!r$J7o3s;(F\иp  )a؈Rr7LhB#+L`HIAB"Fv`.~]NjoY9ݧ~Lp q)}XòfryX裃~zOUózd*\I` ?Zu ~oN?'-㡨*>CuQ洙"h fTPe7)|ޓ ׭GDP .+fV烯$rǿR~ց-Q&T#kZF Hw2 HzD$+(Jh|RLX*'v5!rkZf/ӛyDkՒD0 RJF\z>J% @]d;a`v)S}AԪ͖IWfV̗dZOɆ:R%G\CvTgKCܖjŎbt0bݝowS 2R[UE!/&1/ |]9ﵻ$l RiB=^o|& }6P spn|d4i*(͉|T(A-ҭ!+xsaN+}"pCl'ۭhhAQLsqut+])|b%բR)Nh5G֫I*gt$Up H1^-Pql)SjZ=ہj/q"YG-dF TUC:l"NNռAH[CbP֚` 8!úR[25LSK$Lye&`+bIu+ eHKQnW15sv>gSᤏFqgΞvuxVbSAsd95&*+֎pBC07!&f>ỻY;|wY4s4_{;or0?1г_߁ 5@B6HMm!NƗ̕fZyLvX~xt^?@;IXM|',zC8M]k  V02gNFkCBI"MkHl jF: Tb sId H4mĘ?Ot9]G0^R0f6|%Ko0J/}5}xMףu|`sf!e 駨NBmJ)Wdjo%wʝ38]Vӓ 4v hN7zB D%E$zsR.6aETuˬY"ESG2F7W8_,^ \ad) 
_"R]TC03cP;MJuDMOmb0htp9to0Jb~=YӪ{DN*CO-hz')_n\*)g[r@9i@`ZPcLj߂ +5-p[v"E;qpOsgy{ȦTLvdz7يN=a V^Qk8hx27E a"bGM5 )>ґG'D|I~5P dFy_j^݊ͧy58#9p%%5kS#FrjPHQR=&ZX<}t;yV2–[!"; xrNb'Fc"1yQx밒(^Btn \y,+i()l,U~TҤ~V YyD>R0ʎ 0c c R)Pכ#~XT>v[(Ć:”cٞtg!$KAxK\"R1,ӫ?v)эE}rŏ uXܚd_fZs\cadEpRj끙]%+.+HP< oX0luV+aF->ց۱WieiAxO a˚f;xaM+Znt< my^(jY+.,Z#ͣXi.(.7hGEk+w55NexLm]Q9}d)w#Oi{k]"]BKSch JwcnUFk *)X-QB nGPa. emSA',tZ[.bmݸEy^cY˝wڀuQ.E[u'BH*ۤƒЖvpnH7}\7qx^Ṫ^אiZsv&l y۱tݏ}ki`HM^84esuR )L:uTk3E-8 Ʊ]dL=X% EJVl`8쐅ۗvu5hm^@B1ek99gF2z|`͋:R#4WAy1h +jç~RcI6.exy8}p𨧯ZI-}~?By*8.0,!\#U|z@N8!Ͽ2"]|OwQͧ[4&qAQ !fP FZ0H!y9p)] +ǫyn<ܯ\ѫѫ޻P!`^({GYgov*~̮z_3ӊDٵC GgGk;u?5GʉlLx[yj-z:[Ed{Dr{@^:#Q8ňf >|fGܟW#`9r\j*tC81|M';J4Rj{*ˇQjtS(j [B$|⚌ǠWL߭]Qt0Z4EђOM墢釡gz8_ie1;C@?YZgWI2ɶ-od*^,fU--j:+Ȉd(?Yf3* $J3?cSJ/ i*` ň8Zx2r Z%u{,] *=V^+NbX>^yso!24]Vjyw7wFp<>ܮ\.a0000TwGY4`3TJ%-1`EEVqs9ɎWܗAsSzS3HpUhYx:F*E⎵(NN02f!\+"$\13Jȳeˀ!"#t(@ ts#9rZ/ p*8{i4;l .*j9|~W1BY}{~P{`Z $T_s|{/>4xt{ 6*?WLgߏaO|$p/Ql'! M\¸;BUZ!^_=QuU-輌gav1θ ›aW6a?uu1`ulɥW1j=*zi1ifOa5&rFs1ǟo^o"Vy.P>MAQq]WVk[D&/x;USQC,=3|Yc}6!Xp"3;'CEˆgiC`2Bx%G =~L>yRs9K"i(40FZsOc`7V$x- 8>md<]$}RB?͵ r QBaCD vÏJ8ɱVt;PAK/pX H I K MuXZcoL@(I7/`~ȑŪ[sJgdK聺MA_xѮ1RuZ[;9ȟn"&^2P?k- (l0vrd;X-"LH+yb4q 1ju Ö5!.ČC#D)$AaqY+CqٕөѼF#0:AI"/P2QӼ.`8uN(!-7>Y0/hiFPaۊ)ۙ)ǧQbhZ@>ٸΟy?]BT {Gwn-1o#~5f^ǙUSf `Sj%YS&-DbvŊ,Z)Ȕӧ%U4LfqArTZ}Ua=lky^$hlUcj3?iȾ& >Z~ Ivd;nJ#r,͆!)F /:KĞUy! 
b, ˔L8k)[2$-VˬROY~hّxe&Lʎj~95?.cZkwglɫ홃2z'iL(ޫ~mE)~WL%ʗ_Z 쏳 bJ/aa90 Cd#p8VS2pƽA N2aPre}k3Uj5ES@e(!bJ28!euXmfFo'7 %wN̺lav}+ Sc٫`Y7Q.HǚFLwۯX4XLa- WؘL&3pIvHhQ\"Lz!)wVY-1 ;e3Yh&Vi cQz̷NNckE٨գL#%5;;Jsɕk47 rX?7)'}<>Vy …;xb ^a5/K5^1ddb$hzЮNùAHtl4 g0oCPڝtE[Ƹ>%KqNxj~1!uuS?g$!״3'Ufj,JˤL2I*-S{sdV>^O5L s0j.;A>>;wI88ZepqM8܏gs0OՃ1/ח>ʿ *;L.7wn'y8Z]h&Y9{>h&[:]H7õu9g }dه:3JlL)ƹcjqȁC/=)wMc:irrnMep76ޫr(@K\fS6?YZL-/|D4]J b)aLfͪ7ąKz)8"3eu BR߹AYk%l8Βo)(dSfL{0lBRKעm<}biSS9f4ղHx:}R8\ܗ?k7WӠHXbC%%;!t8805q,c+kWĆ+'7Zc> ʂffbOy,οٔMnZ!rSPsM b4<ѡ  ] ٘o{P9#s-6neyC|(6" QQD2C0ޕ|f5S,l̷[b`[|3;} 8v@Z% ݥR`A\wwj6EAÇVMH:h$Yg+4.s*dZH_~Qr~U~2/KFD yaj<* _[WZ ^K?Ut;50uEA IFP>^iB/?m8_ilnnN%uJPY@bi: w>Ţ>Y!Džy}Nxn'@8iIݼmL{sߐ]_4_ߜ=,hUVy,ԊƂ.kY@-&RH\kBuRM ##c^!Vͩ ;Yɝ_"~k=XN#DĂ:V3wYtJ1tUe<!}0.ky4W,V.Wʭܳ5>_B?RylA͌)eAGRoKZx$k')$ꮟucՋrhqLF#Az{g d ik˔jE!IƖ+blTƜ4%3M"iչy]WYjȴhP1Bwmu(ZA.p ")8f5L"Pj [GwZ* c|-ѷ S;0$f gFX UF#%f15*0=r+n4̭Wk ;nuB/;}^gz8_!;;CNKvcg:HB]#%"5ϩ&)5ɦԭDIlwN=Oe.5!q U|Zb!/JS&<\sy8d-9: sD|t`}j Ipj8{A K[!}7ע̃6hi~|{U6ࣝ剾??2(4}wOSK(ObK< y,+YY!]Ŵ4 'SIS >&v2[dĜ;tcd.7-_-E'eKkNF㬮l;?z'˟,ߒKCU۳]:U?.R!]9,EbYɽd>-4 Aȶw6V)m1ӱq`;k[y݄Uz 6"˯Vp$q )@J;B0q"@"`C%H)\pPy:̧q6pUtKh2|뫳jv`!&Jy-* M8I2 JHP:9mb+9QIZm Kg">.|"2j"{Wۃpȶ08J{98`Lz{z>|ВΊ!A&ً!%TJÔ|)rN&U&ǖHѱ4DHv6_]B{8Aǹ+"FOé1(b4\-Ӫ.% pym6"}_%cR>|$*݌{Zvl42JvIc͡iaږF(&҈T4bRń@`$ P4a[5$T1hN[51ol0Su&0wr97ǜH!s\X~s}l W9O]ήswrϒodjNV zVd3v~};?=?<ϯO޼~st~L7{(:^F?ۼv[6J#w>Yû8*}1uHWTy!usxv#1Z!pluyIqv;/nA V(iL︱ ln/ ZL g˻Т"3 UZDׯyS0)g<&QKлFʣ( (sRp *ə6yB_:;rs?-56Sko4d5\xft%]HL YR_1R|ͤ# k A{Z8$dP0VqI埴ǡYyC$e3'Z] ǚ7@e U`ނ Nc:TĈ7PzFuA&?'y 8o *O(3Xs9;O%+=XUDz"@0?R@ULKI@oz]'Sn=T fD>fJ.75FRRJ]nN77z6q^3f~uu[ tmPnȴ;~~s0Tގ_ԿmFD>EZmg!Qzi6ZljC,~VC5|glL#1-Py㚑Zr,/xƆzZST21y.58AiJF6d8%#}% JvcK,1jh5s+MVXp?}_Y {%}泎~mC^n=@MO5GsYg1~]_|W>.,Dk߱߶EQ!'|\wuV_C^後q6/u.W@ q# hX3^^>&W BzXAhUn}m`}qtQupu; q|6AGySgW~5*`B vwT<*ƫS ,e0Uy3[HdCHa:h1C( כ(6VH"5_>?h.f/ e3+땜 }骎Tu}YK-m"zDSqlhM*t?wjSQJQBOӚj(Km6u䑚Jy_<]ߖ;7z2.{ftu&95=?V^6@׌״1g[:7zo~\^G+Jqce/H^ޟ!:^bl؞LqG&p}TCr5qt8񣮫8crNvQ@n~90%J笂6pUYU!xkEOڬ6ט5{wq:"^y@v7u~ڑ& 
UzISoCқ͕}MEI۸tW.df>Gr4T եܳ;s7fަ#y 1ށ| o0jxMŦ&̘Mu̦WUouԣk_G Z~n\!ގ4$+8imaT|qњ[y$4!̰~">!?oDฆܩĘ*%^@yy,o2RLRM>֔Hb&kOAj"W(d<_m#da_FKLg>JRʸ]TA5l|3Hۅ@,orUpq??m'dPٛϺEץռsA≠qdc0@="ڢ>x~ZjONf I/EBD@2.zA3'İ!.%ټ!A w4@DѴy)sUD%zW0p@A4Twt7{ۤ L7花Z1lJD&lB[B&b=G} zL`CTֆהyn1{Kz%mL-J=e?>؜,2(w:poVU'#Aɏ߽_rJ 8wa boP&ۋk֖&!ӘV,t¨o@I_7S;;jL`m\QZ(Fen cX1Q ?O_Q}E4~]ڍe5Qk񠆍#cQHLzl3ɴv@e)2) TJ-3F(LjCŀ* éI֢9HkTrvf/Y*RK9D9HK-C\af#Ik`Nv1 q]Yz򓬠 #O ?G#N5i}i'cŘ\~cS`TMQc֮2-6_JA="pփVA1S ZSNm(\Є(| (ESi2*S^JbS5s/-SR)2x4F^zD0-TTkXb(cDѷS3oI{.LA1)+[=9RaJT{#gtꍪ_"18 .pTB5e4A> p?9O m=cJL(JЦ<Ǖ%۳5s4 jp^x\<9%\4>JCa45j;?[ܓ")[.LBM$N_Ԇ&UF$RTLPI94i ݆5%N rk;%w xHhpS@+ؠ ).FM@جlM8[-:$؀IsDI ew`-11#tː.9ʚb&5*,s8meAuC fMKT fHb&Z/RbL&(:(s6) TSƷ !b%D"c\%R 0 :J913;ĭ=f8w!uL9چ撶*2*m\4"*Ͻh9>z&1 fU;Pi&{dJX1bʺ| V25NҊ:L< f6X +P*2V0Afא9N1hHeKI(P%0Nԓ3+$X ؽ>^G;~b'X1JvljJr?ȭ|Y[>wXYk0ܟQ Q኉1JfDhHh ؃<'XMr٘&B|z5elظ{u3{hr⧑N.777?bRn}|fPב#ыV0%`S&cLhtsu5y;[8 .>=ya'>&\0ŀ=?^\nUz3Bň#'""rbwv?zmu^VfO5h[h{ jV"Z|_miɚ+kn#GEᗍXR@:Oݞ} N]C}M_ :HC4]e"Ʉ\ɏ=m5Y4oJXJB< ͵>Ťy>0)܉X$NI$Fo׻O$4ODF.Ē0LtE^*`w6IݵΛaB$8R @Lxv_]`٬XxپjGM2G:d 0ғl 8~Ne(֊1l8Iu{5R6eF$$x,VJ? 
e%TC3e6Υmlx^]yqW͊&e7gp Xdn@E1Xųav{7mYKX/yn[<3 /WZ XyJK搈wA,#\t胔)16i'oOJ2q‚u/>$Kt'4VQP2jYC0ړD+r [g # D $=MF_Vk^|kEϼ@>T0dW=lp{n4Hd1"W[#kS/,TX?gG{9sGg|d2Wk%ĥբs{i=Ē@7T卯7]Z-!JvŒVy\ֻOMi;U>l!\=}ۆV=+wTp +^kjxV7ޔooqyy{w*.>d|ҟa_3?j=ox.t}du]|2x44bmqta)??(Z34c{6Ծ۠ֈ|~hU~;E._vG?]7b;e\ս g_߲Wu<,W2ߚR(HA( 5Дd&Imz7 JwaEcŏ_.t#\A~8.\CLZRn"}1 &I)/Z\QLP3Db|Ku(''B!-~+Ch¬}|g;?իѦ*qzkzA0aWr;kd]/WS2sF8`:Di=^*Q@,Fd%I%dzM*q֤3+DXk,ĄL<9#ucerLYu rkA*Jk ¹GD9Q ԳR͐r3GAcȋKW!uqR%}(CF],-Cځcfh^ŎɥBWD4kj?gHr1#w]v41>.UFNnjQO` tu!* |s-zBHDXo'2>qlj狲Ҁkv\뙧NSg"҃U d י ^lʴR#ǻ [&oLLM /5=[z!ko\EAu3s || Go8Dԟ׵պNÛEY}/o2rbʇ &~q@&Dx/ Np'~q섫yjd]D'~&e5g+ֳp6ZDDpwMߤJP;UfJ8YZ(d)^ ƈ~cNkd9@#9*4x]j% U1/n]ҳ  .8X5Q>RLĈeV$CLP+@[HYd2zxJ H^X+Lڃ,E-J|19 .ʠ3e-#r)/N4dP@NjA_֤hPLi=0HVn\fydJQ"5 DM~@ Cdٕ|J̰SsJ狻Fpہhr8'6x35"֟$џ`H&iu/o ѯϴ>ˢE֟$' 7^>$֦X0+x.“]Oqѷ;$1ƚ_qD&eĀAӎ9 9kd108nxّ`{NH?ܼT?We SXs~nuI33WHgE͙P0RD#Eई$OA+swF_c5x<@^߫8ulWwl[~{tn'mm$*_\tÿQo="-~c_\:/wKLx{lX3Q$+UB3/lmSA39cڍ|STzu尲$iKu A# IbGW_2$7z6֒+t&MWm@6&rz@NI=ΣJVG kL!\u~޹*cw >܀S`κ+ɱB=] 6EZ-J=hRH-]Z$Zhyzs&K=PcY6:A$J_JRmB/ϷuNEWy}={Y/Q0M>}s_k%*EJr_3׋}]`V Ph21F$)DJd6arkR8=f)_uSߗR$Cq/1.7[업<2 [D"EJItIY"G#_}'47J?L4GywN?Ҭ9c ;<ԫa5zK;أwI1XxC M,r-MK͔;Y=1_;eDJDl(1*;hP%_ М䱏<@SNLk٠piKb}Ot9٩rH5<2v.z; db4(vj5L2ա/lv4'޵aXB1M1ed$5Ap!qт&AʐJ3SA45ڧנ'$q mH!$"=E sAؘB`FD]Y/SKuEs]* 1iR#f+Qdb^<pNywi\A*-AAI. > 5ja Ԩ3jTY]*dư_Q+8m*1)yIEWE V( #q?>G3ji',HO{Y~)1GPie7&d#@,A&Plf@0j#n}(}`=OTɰ")>GM2}8DUs#w? 
̮|/]bkF/Z[A$.ÄaC7)[Deė~w|Bqyͳu0U h E %_^@\J4;/W@)r(ɞL~px+K~$?*ŬAXK$@Q;5L,C!`/5TfAa.Ok9F4dO6o Xe[zV+_WlڋbI}g&/z^o~>9c^6ɂ^^`  rr dG>l$dk<Fc,~b݋TLUuP++QqmݕJ&2RkN:ߎ_V|||~J S|˪(xguUc55ry-%/vV2NH3N\:EK jT'*VٺNyܭRT 2;/CAG+|O9Bd{<7QTN|}e=za^TTTVxkޚEZmRh`2*h6b0|oF耮\8芞6Y٧RzsAPBVȄ^Jr5 ^M@8*%0b)j/|o(!F/WۧL Wb j۔( (Bc⛠:xP TZii@}%(.9 ?ގr4K /Ī<c7tDf6O Mą7]Wyoy'lM7u@ aD䬮m"NkLfaNP7>AiCXB|Cʑ+ =h%>ӊr oR=vNbci.Åvp.y ]OBgżG55<u/m<M8C,8T%nvK):TqЏP9vYj .>]`A;}nJXT_E@igu?Xի֯2 _ bwotk}J=i=(bk#ݐ&){RnЫ;PAI]#FM}).{rK&&m_Ѥ;%koQb[\5bۄV@1+PqC~+:Nap)OamSBaÔ',P{)^";0jL%!15lhf#I}|)%X3k.X&oeh)`vXPob,ЭqIMؚ~J2޳KB/{8!ZkYsijKSYROyC;JV,y{xF0Y.-瀆=TygA?\.4-P2t}(r@5:f/1r@#(9C A"1^ y0V"lQ%pd`fNՂ#ܔur?W=#0vS#dl`3 ߵLK}kMR. FL 2U>͝BqC~+t` [\5bۄKKLo(Ƹ!?MSD}8_K7gTFTtVTr;ɭCKwF!?|@Q|\KL :oK 20 !$H9W}j^jnDzxB𫋫oo|"(;2lٛ XF)//q )֘fhz\=tOeJX̮V؉Hx^!yNH7`P,t*X^&ҋe8 @GQRr 1|Rw J W/ ( /23/z72 Ћ_"$3(+5^kdj&ĦW5Q P~ҥ~7_AStˎi3s]wP=>SIarw."nYomn#8i{c(v64y/\=o) FXMI*3k#c84&e32PMǔr!Oe+he i "o&@ (Ɇ5%}+ jIZ7!mV(H_R$VmQٰAN=@%d;gfQҠSPiBlMIU]s"J]Ka5~n&v\ZaAjAT&@ѳy_\." X|;5IxQf&ХfhfG^豙}ܝֺ J4V2Q¬fd]ZVDZ<|xۃ.dЕ4:̄9 $5uVWzS+ZAjUMpwlXc pv \ZM)ao IJAyX^gOAFkKc#DIiM 3V k<o .|K:=B!++cVV,qHʞ\絜^NzJ@$FqǓ%xlm6kFŽ*CymxKr%l{=<+x~Vz>ֶ^Nx7p),MY0Wo<ۡ\id"7mm쵹-.Q@ $3n|rQ&[^IVW7 'HB\FәI!H^؏+pzODݝan8x(SJ̴gN]%d3aPbea`M`N\Jg*AwX;:YÝxkF?w3ov[4ؔ=2t캴7ijʭğ$PQFT@x Ɇ5[؁pY{Xt߷yD~;gוWt9HZq\ xZ-E^MX"4 Q|>Cjٞg/]f߹;<=??Z]O{+&':W_n?=N=>u7Sr~y:ٲcnQ"TGgJe{?]*8VJBp__#I~~68aǵz5IUL;Zzu}Qjcp]Vg;KTuW$*1{%*p>dKWc3I3bp]/swy9A3Ay=8 #ݙRa?/?#))!7Rbhe٣eOcJyeylJc+|YJcSx Vy*@KnX& wA 5Ov83hf#LV#f+Mg8^ B5AiKO% .dRҔJ=bTeC }&5SVE8ì3҈V=RЖ)R7ҽa'4!zĘ`2[8瘻!p0Idi9yrsYYOl{ԝ1kٺ$:+< ;I.v&<dPR u.Z+*x^\욛=FFc=Ҭ{sK. O!1F!24.ĕǔz qOl]*U! -r]_K/2 [ܝۅUg@~bbb:x=G*&6ooW1˝ Vڋ<:R85;+ަ><} aq Vu@ANkhH͔#*tR|h-_r *K{`.I)i/ EiecBiUnMZGˋ$4Q.pu q2g'RdI*$T JǔQUӃSHr}h[OT֬t ?\kV"k%;!a ~?"}Y|Jk)9SXMteCɰҮj[rFr9{.2jFUV)hm-B#t~KF I{5ssf 9;UKϖ'U2VHNO^$H-iѳK ߳r@QW 5RRV};} ^K`2xyVU49IR>6CN'CC*ߛڄCDS a|b8Bcmp7xa؃8d ; . 
!w˦%uUI⸒ eTME6 q7d$3LЌ}a)ۖ߫YS##Y& yUg˫/^~],ӫ:KIr^.h0yrhNM?,@iF[掝-MIRd&i >f~ (S*sn6n~Fɼ3k {x?Z e3~ 9oAΨi{$4Zl@AOTUXĆ5% 1/|ΐŃ+h%Gׯ1}?6 W'v1H /[{+X]{SDz**97Pܺ( l@:T4JdD$M(۳`˫]pXjwg3ݫŕY%^+C|vO^"ЛW.%7ԥ#3 *4TPt=֞iҞ!Tk(yJ WtCycBd}UPd PۉA" E ~Z@Je:7ʦrh{@=JQwUKz_vojF>ǔTzM|bu;޼P>S-s]%@2.]fХΖ: ]I D4A9UՕ[Z7\'b8gA4t|N ^4iޝFY~iy':SJˬ)#h+Nz^ݠp?v<5aifw ̥E_@a!vIq燿kDR a:|hosŕk%lV"E&蟘C&WBK[DX57`<89"ZdGst> 뜖d~`,k_Ɏ HSh|c^ 9r ]Ǚb%N.ѝ1wXH=.fdpz$$n ^0ph73-V U:J")#CZxLudZX;fP:oRo/Չ.$ڛ!N݆zcs|20ĒxwBmvBOn:3h#;P9JSmH_ANy<<.Q:0.C\Qz-"҄bp57RQ6چ\[ֵiiz``,(E./uuX.+CV@ 'á]^T `Pfj S)/ogW$Pcx뫼 IN&9Z(0#&֫'Wzs& 2h}h1Ѩ$w$ZmyeAٍ8dgHG\ ]\mz/rRNA gd_Y:#EӘTFaK9E<Ҳ/3(LVvXli)*jF{*-A4q(h^2\`D#T0Df1i0FSFP˴VI$b BX ' 5E ? &0$i:8lUo/`Chf3F l !.DeyD)J#ъႩ (%3CtXH7h5 y)ŴGAa5|H#|R%2,d)K hɊ \Iݔ'sA#@y"3/ gXF"FKE%hO"LI@ ʀT fbֆsqtK*dz'lQ0l RA Ê.iA702=,Ü(kdQFqU4Tdw KdմtW{96< zɈHx/I[| =vB|aHlܵ]h׶qfckQv`[ 5=ƵўQ+-I/M^AwǭpH$J̻.Ev93v9jmf~̭w w9vDR|ssW('rsQǹi.=2j8oy[m`Vۉ^'F_\qwY!7O_Y@[븧I~f=.JY)mE,7,dp[?ϊ/ #+7֏ew̸X;V|[ hJXGd%;V<`i^}`K#8|K-Jm'TWO#{Ф ]wkwU?0Ls끕F97υ[5{!Oڧqe׮IxH %7fcݽG}sONZV1z;hjͅK ?vG-ͅ(Tq/xyR֟QWE/]?YɋW->\|۴(r\|Jcr{ӓg㲙kzc˃s' *q]WǑ=k\.R,;yEX7'2/!RlEj%., c`dZkdAXO/wOI.۳W,pg; ܻwGWc \RLʮC]']wtB*T(h_NW 4>}ZFĘy ɲ/xlr8d>h[C'h(VvNX)0obY)I[Dfi;KYvw(m7I gNKE΢JL]p>.@>ŊQ׎Z{xvۿ!@G{oqb4:C6 @leSٲ(,1PR]9(D͆S6S>~Zg>Ϭ}PlsYm.W(W wųb21무 ڂy+]O@W3ؔ؅6wк~OY<լWj[VGt"\qaI+oёLEeL@:qvA栧 ӉPM2Mt ]Lz {\KOdq?%d1,\ҿgR+>q[EzU VBv (ǝaBkǪH_턁- EA)[%=w`G,Q7R_yހiw%,NњM>?Ro/ymРSncpauk}-nJf?}R")Y_pT/?ޖghEWd nq!|gZ/C|]}^ޅ݅݅݅r.Hœ&UJ:,A&y MU@W&Z m4Lٺ.pTOW] VvgG/@A)^K^ị嫲uw '~'+> Wg7?⿄!B|׷]w$w'^_1*S%}uڑрP3yYy凰 SUACv(2 VKnm_Y>$',1@%,c&:0ߘwUϏ]FYeuQ?#*/̋._?6a}k-[yӷm3_H}?#W~G?Sz!u W [ 9^NN/^\2 %->HF1$/!E$nCE[:Ҥwe}Uo&+$(mNdLj"gT')tha Y&j彔q.e  P"elp)4(qh;W)ƞ<(0*͠`\1`$Bױ$Ȓ",PɁ>t6„߽!P@er&1M  V6q #B %Oc2;Y-'t3.+IC3&i> 4ײ60C~5}`5&aҒB~k/G)?eX ̺:"G/{p hw ^>KyPͽ0"+4NtZh>trCY2[7%$mK% @%v[8̦21I|6+T|hf;f=}T诫02|8EG2{-΁5;;w\uꜪLq9%k5TV[dhFGDYk*m9*|rqԈ %{8GE[ 5B7uY/}bΎ{).#MCI|h:H%uAY*VrLqBBV^%zB6(٠"E5Q& f$m]<7~1>_[g\.au% %F$X5`.f圡"@Ng4Ҽ}J`&]\,o!.jQ^mMx'$٘;"] u2)]IC4د*t(ӞU 
UQVi,3EM3u $b'Iڽ/bxOSZQ19̮3o~}i[\5XOsR1H83&yo򷶹RZ-V8]UQ,jۭHuj ìD$>4s0,'J-BRA^.mTy.ʾGD~ ڃT1a=Gttijw@J^^Q邘 Nr$M=}Hp/xy:ՙOmy1# :L@ht;;: HDj[!⫵kk42q!:P\:j4isD{O9!@N.'goDTLJi;u* ӃUʴPlv %*&v nEe6IXú1 ՙ158 j"8}"N[a[S{I_ZԕvT0]PΊ J3ICLʔeP3)0dmEnPlle~Vo< X >hjs60;K:wEC*2w[ ) (U";IIF XG %P8=SN}HAQz2]vb^$L[ mmdFo(Ew ֦SUcKk3UۃH!Ւ AHB"SKnV8In'0>'SlVﯷ9eڂs(YBঘЍ(zB[(e>^pKq6uyƏ-}1oC7v%XNFt"k,KdBKa`rV6+̓%3jY̌ ܿERWlepbgN3aںsK&BY\Lh%%i4^,t5ji$"Ⱥ;-MFj鄊J4z<*QӉ%%쏎_PvjeۼF'I MV)l 1Ԧr=``P$Jq,¾ڱF1vA):PދP4D?tH9a6x %#+u#ŸfU-(nTvvb NqDټ5w (_ڨÀsjިb@ؐ%V7gɖH_vluÄ$O8bǞYZ(Fl@ ~$Xȫ,.mD~#<:Zf K= %e_oTFPhN$MH!eZӍ\o@~*O &1-4N|Ad_ ǯZvt].^_WѨ ^"V8Rq>cn$3wxRit9ziP_tT )Z;oiJ3KEce|j"⼂ф6H$sq?AaMY8HOx.+sB286nųƵT(aSe}V S5`9t_ؼ/97ܐ*./70Z퇷orb8a `-4Jw6p}>yAt0'.~} x2qrↃ{*8?BMb eȶ`ksLm&5;n)u9>> Mo3V 8rSL٧h~X55z?ŷO`Dt0- LnYqiՒ|?ݳ_HnRU:UuNonu _tOȈ_lf~X,%Ư-w}\i j5}qw|ׇfp<BFq-aD1 (1x7p*YTUNjCF 1 QSʶ@."b5ӣ#bNłS2hBҘHX.q&qǴ"̇@o 5atG-ʖu!W_U=B@SU];q\ca2 Adgrs*(\.vo[sf_x^hvQf5ebw >M;n8.; sY W_:^:4 @4qZL̶ԼdypޅY9yY]pT(ST8"~x<ɹ0a9\(~}[|W^A|Sfɉ=߄`خxr؇nZwe>@o[YWQ[2[ȋRWn*qy6NarN'ZDT;7eK}nLՅ'Ko; k?P^|| +FJ*V\ӷ]i'mU{A(Sy 2/~ǯ5V,xvD J*l)Ԫ ~z&q_o%~\EK̘^b3Ōt6F჆1D`%6 Z9X\—ޏWk;a ۟x\9s\:o}T_۫神q1%1q`"ZpTG:e *˫x6'WQ w_!DJڎ,zg!C\,=@R3$r˄I j!vC=5z!5 ԣw[cd+nDm#WLa456KF) 3ރxQލh+ .~ 5C!l(v 3P. ißL'3bwߋ+PX@トgւ8S(i~'m¾wT~>`X3澫[`#]xPqEoB42<%.]$ʦȢoM#}N~l*/7wę) sA8쵝 3g:_H];h$jS~l"Ϝ">w~{; =Pni|AQ}NoJޖ&8IƋ/Q꯯iV"S":l U'*邮jźX:|;7}PwiY^DK r#p%ف6O:]&kJT?1L~޼ܰ,t"&vW3oz;5S?[D-& M ӧ$73Hy7୐8@&Hd4Qb2M3$2ڞUWrpWԺ><az1ѧ K6]WX~ ]>Wx4čG'*+|,wZ"q՞|閯OFg#k=s#rYMK9TbzT`yTW?QXѣu=MhDD72f+FW=B:eSe`2 Η$? 
jC(/ߜ(tIF1R i !Z ){Y׫Ed:O:PRR!ȲOSAX5$(EJEsûZ{Aw5xȶT+*tk5¹aj?N ]_sչ\tMsu:s'WʞAxA;sΕ⃔gT/L?2W^x_n "L-ox3 voa!6[Tڙ)O-k!zsBކZ2`38JL%8$9~J9]Al-\PeJUTזSyݭ0#Mp(&J?{۶#݆(a?m[޵h[H${uvfAd;Jb%R$9r, kɣ!ϋdq;'9x\<CiUShv/:SbINĒoX=,ĈEt'P^`Nۿ~37BP˷(6ݗc6-7#ewJ5[Is=1Ä-f[6 d˗ݓr ʄ:Sm۷gkL9lj'-TB-]aR Hm(xgbM,J޺e9Ula1Yn~anˡ.F?Hc)d%UXB\w*Ն,liYV숏5mX3#) ޚAO!TSيOك$,?b<@1E O[HҞ%U0Cev1'Aԍ&5g\RQL̽{(2Sc4-H/=*^۫(A/[;ASa$Pp9H4+l]XE W[bum|EUid>B'Eշ?I"KL'5h8v60aCH h}qG3 <]$87a2 ֶ9QE_&6㹠KYA+n׽U|="/g":+nGNɕPr)QjS׫*h_TEGX]wn6&e2]g5o+$~SipSic)V1E)Iܴ4վ=T!K퀿 x-Ϡ<1n*sAhC6:_,)9^k֛9/ %3':N& W{oL)$|:>򚱺Lu8s:>Bwx I {.E >K5o.вy$o4x~&d5^!Ag-D(Ą;jÃa49by똠ܸDj dŕ^TWm b)R7/;Qmlo4|vg=Bic8wvr6 sUVBrY1řYu/u BHn(؍;m&9q0BsRJD1KQ5UN3P\HcI~%HHă/6gjZw\7qQ"קKrZ5w s?Afm.Xxq]dҲ) cfvQޮuku+Ÿœp(j=0漳ΰY4e,d4ܙY5; @,gۑ? ㆳ9<KFpږ.;8ـn7}q2 8MmW ޘ&JsFriIcǴvCSy3q^&uNkE3I; ^D@Bӻ tv}\ e-aYB@jɲO,@S mPMћȥKȇMvsig&^c@Ϻ{39z0suaH?r^$b #@ 0nY4?Ѭg`zv<;g_zA0qh~hgRx%;&Q3xٹ,N펦+C?l]tz@jd48z߿{ud}SGJ'p~<IlNI"+k2: NNgtxt bui3tWi=4|Z?z|OOµ~χy?SGv@)\sa2jxeBh`@ 1%TV(ʁJbԗ`H~Bqߥ>&K.w}!.[7 ^ 'ރA ?tԪ2Ojۉt'̃o˫pꎡK.wo\Cgco'UЉ2Tx%%tr%C BZx̭JaEϭ>='g Sag7' U_H@p #m&'P.2^TyJt/NN^L'O?΁h K"g:v,z|zL/}`d{lϲ>ODbI-%R7'Vb,7,2QlDÕŜ2IJe "! 
[binary data omitted: gzip-compressed contents of zuul-output/logs/kubelet.log.gz — not recoverable as text]
e)ȘU:9gt^FFY#VCy%h.5eLfR NsBD=,Rc$ x f!*4̒ UwSDWl.CFB7$clk~ԢzD,M%ygXaRҥ^CUHCGǿ~VR *&1k/m_*P",ס(ʣIK }vovXWLECۡv 4w$^Lb^QAM$iE:ЪE)PlyJWDyrS8BH!5ю&Æ36%@^u JP"O& m2T(J+_e9%SQB;FLy }VɎ[%CŽi?Is܃IiuAUI/>mz4HTy>k 09uuu <.FKu4!8׮1wNi\ڻ"$Q6.nXOح@iۇ-?Ǒ,Z]d jBSPzBJkNaæagoOH\A$oZ8ݚN6Kjj2_bOvY!%1tt]tKNt(iEV)r zJbȺex(kOvO r/Ҏد-`9d"Q%/9=nt<)yZi:3ssLmЎaYpKB ZXn4D8TGANۂ n!Erўp Fȋ5$&9Ʀo U@E/ݸFuBQ9dsc,+-{R?aE ]o.RRywҏ[al;f C<$JS'YJ``D(1R0ZA 40 )g:{1Bmvg uuXB:חTy"Thh rH~hH,4=sbEbX &j%v"BEt >6H> s|6r|by /ohv4,^Pݤʂzq)l*f9P!B;˹bYd ń:lF/0CP :`c$(ҰD6[9R /sд;=f@zՎ4>HM}Sah߇wf|oV34W|Xk>c{.wt)P-yDֆ|&ܦz.z|b_L X0*ף¨ jLQy_Gƈ-3lyjS3{JuuJ!::*GiYe)J%mn]tqU JS'WL?Y]Ju97n~[zG1}}7ɱi H>}pcb:/n>RWW>" چ]"Ut-DƥFE}*"ڒ06&mTv A IyS\ڻз )AW~, {]u mTX@u {S<&Mv;CABL8PcۣAoG(JhvÇn$//jV͉[@L^}/1<"ϰC.QM}8-_-LOv9QnΗ Yl /"S}D(Yʈ- zQŃ;|SBУw)Y3D͞qP<HAŏfx~)92K **%X)6W2d~T1`bOxbemX/.C?ןʕ*AV"n6 B;,I9B |n]o2rb-(?|&k,׀ "ϝqz4oi~OWY:ckΆfJS(jŘ Z{b[Tb^# nNqb1u}$Oeό{&I7dNB:zm&'5ws 2+1j%tӼbGj>t!ߟ(Z_MaV|v# -m@+U .s_sl>+s4"(?ϏSXLZk7M mҷ,E݃-Jxǧ6FK~A1ȑdij䆧N+y="m]~XHZ r-P>X[ޝ9nuI8W>b̘KLǯhQ(r$-I.F(ܒYsK~/^[jG BeÍdށս+mШUIph;П"HW!2& yjhB6 [CPF$c2HĪܪ8gO&L,_YIP0@t}NQ`(aw$:EHבw&>uk,4`>TbjA@+ҏuV6n%(A(b5vpg4fhT4Zq(9Ӡ(8pZR]EÞ=ن!52adP//)vg.ٝ3w"g#:Lȑv+N6" h{N~3'tnR,0,hXak1Øhl i [0˃ti"Ry|zֻۙImhJ)6ѵVōfЈmzӜ n1`d\jO }|ʠ_2eysFۡlG2}#h7Xł]^[1 4<8tjy"O7 or Z75kmWY*"Dy>fe 1!FRG3ۓM;HwBQ8aUS}'wf¹Ϩz@ Q[S~mXWnY6%.o[+J7ei-afO9ng{i;r.$cr?i>μ6hR7ehֆ|&aSD<%Dft0!l|&I*@e%uWF:?Kcer1H'rQ'΁{<]2YLLhsPb1֮ @) B0% N$BYv5kܬ+u.N'p7#3d`n tϻp4Iϸut-#4IgFCp#}D2Mm47,>1¸w`R"y6K>|;vfY-xomwq"Z d!Q ˠp RK0But Ղ|*z߁?&se1}лzCًݏf&ǫb9_BPJ(ZOX#1=m:c@9G. 
o!y΁ϐvTxuS of|t yZxD XZ[8DJM[9b rn~-.nXB@Z.`kI4mV r!0gjYZV:Ek{% 4Ek[sSRHH-t:?EE4δ/Ý,<}rKF3.E!p1Q5y4-Jk $ghl[Ҋ\!ќfJ?XL9"c?xEH  { !DvAPG/cQ%^`ovEUQ-?U=7aTp~v9i^>m< K1%N+/:St iI>)wy*i *h3~O0mK%XK ڂpsPUK-,o㍦P5산Z j'(.֪`8d%0laBp h ÊArAXfe`Tm>.<NtסjWO*9|OQ)Y?^>.Xɖ4gVŽ8#oO3$\pRZ~~AV0T5f>L: u߆6!c8G:v04tPcnI!&^iN V u܇77]ǽuԐ.po^RS&,P ~Ń%N J#,kuH _0/ 0oZ?bJE2/0@ DcTń baZHG>TYDѤ5~2-(tƻ9v9~UJY``_$ !r~ք:TjF)?Z򫮢BU`c1$41!; 9* aZ{{A0aU/Ԉco=^009z7p]O%k,lH Lf] uVL/v"$e\=~{~岢6=H9A$j>nR5zb.9t9[ pazl}m=L[Xu1 _ NH~$1|Td)u8hA*)JmPF0YNJaS͸tyىoq4YIaZG/SXbSNQ;0$Ni abT1-. `unӶD s.t^R@EY\T#Yz>u. PE%x5 }pJhaPY]1 ]Z ,HcVEaq¼] ޕ~VQg82 AVMh'O tct&J̴&x7KXHPu,3B`4˩)E׏N;ý]v.jja/QSRu;/"*QiJ=~pΩwHw9C:jjH0a2g(EF`erA4R"{_XU6>/+H{,!􅁭o;vUVvLۯ"/(K0_^ueW%NIqkF_TaJ 1Jͨc]0k'/3;l%b.ц_P\)Q%&0a+]NEЄAp2qM@K/.}:#xqQ~e|&.i!.sn|eKkIRH0s#/呺 Pƻ<} >N}Hr}?RW-HPZKi Q(t47TQ*KZHWkC*o'Vc$ב,˵ZHQWw>SrW'W+.GRaW3IlUkЙ͵eai>u̐ADs]O|cȐcnb\ 慽7^ (0ʇ/4,f̿NN2脦rfG yᢳM(ܒ鯽yR.`og=OI>/?dgF!ռ!-Lav2Ocmz߹6=S{_?欌 @=x/2~͉]Ey8zqd1C1*{ r3_};.Aht o,.2JajwME g%/˒Be(N U4[ɫGLW3KlJOYTE7>5J[]xQ:|9hJ ;/kK\U_BPAؚv/@@pH G @! ɋyWâeY8Q HaGJ  bCq͐k}E?k lbNղ}#% 5ڣy:tKF F~t%3@S1]Y$aS/]V~nn\嵄ݻgIL(Q=yqkGmq5s,G:;~mzOakzE/~y S GXcn{k $g/ƔxܮN*Ҝ4WjxkE DbOYȹqbQqo~`!J&gK1mf3v7g PX#tRrICEV0=r=~,2!t*ma`8"ʁ˄A `"o1 h`^֟ `]3ʼno r (? Ѧbam K AK.0O؃QayGĖqS ‚G}V<^GCh2}ߎ^:f%S ٧:f3:fU7fiZ;c3X c$Tx?~ (<FQ)7#VIF1DCcz!qDv-#-^\g@z"4{kKEOC4 s#, H?Nisu.g?fZɿp۷O._>U_/Ћ<#,BܔD$RjS%DNZ')Ie5ˆ RNZq ;w C'O6w'@X=َ[LkZ=adT>2}oZtxKb^rz O)gL.^w_)楂YrbMW~@>}ڡ!RR)uѥy}01fp.1RAcJJRX\rp. 
"Rih Ǝn"VvMAK(0S-Xp`(r99[(PLvCA$K 3|`[DH2,X^k$"K;vSRp,WhI1Uӆzp<ʪq٭+>]7Bf  I)+Ywg b+SXU@J MPLF Be $^}JM /82!t`L-Eңha,[_ed^N|JJ7^FK}i PR8/pTP.E+ၒ˫q^:`]`N[0N ^ )G+Qu/.h6& AJ bPAuir}0ۇQq2_<~Ykᴽ=lm?}n2Sx*4ePg92#Lم ˩qY<2D?n:v( NSqF6Tjz|iĺ۴{wۡ 죽Ap2)q?|5-2dĉl}vAs{@s1j) $f}$!Ju,89Fѩ T;qC\@b} %g,U=z)ƞG2$1xKdTBL^-R!u?߿uHgɂ%rCʁHT63S,,}fM7qzSņcK'5 (Ͻx іhZ섃[r*\m 뾨VC/?YY@ϓdCaʹ\N :f"puJ!-*w9"in9DJ,(j[O'qX_n3;FFj=bU+s}ݯbȓnY2><| bw(G޽}BDi͗x3˯݂ in!=_@߿k?/psFCS+_0lH$/Z{O< %I)?OĞX] ֜~2gŁ{IԐ9zտM~{=j4G;ibyzjGL,b4Of`>/044gr=}Τd>+p͒OM<&3fsIUx1E>hp92?H̵!˼PNH9~GdޠmkRj?)7kC$] 1y&}g2ӭ1z86}d+nٞMÔ dR|fhO'W.rHGcHꪇ%C^xMG[)Dlqw7wM;-(ʷ ?[\UӚY+~]ICd 7VOhpe DERhIq}v>"Mc0۞AKu`gO*Y793x3ljxs=Huyc1%GS`uc#+ڑםi-))SƝ{ά(:R1cPKHǨ#,8MfP$(mXUXK{ɨ@9wzbqZdS tw3QjΗ 4z3&fO˟*X_8pY8S0r|^;\@$R!׈# R5K&{&WB$ʂG'FJg(?N> ;OV7/ZrhbO<í8 Iޒ>U%ܔ\ޖN3Sm餋mKo^㈏oLWBKjƺ~<]䓳WWw~2~h,g2TU3Ȝԩsi *v,:İf  ̣[`8m F;UnL5JDxʼn &m}#+,g1k `{t`F߳Mwn\4$!/\D;5C4fڭ.RD;h3]iMnMH :2E0ǹ$}nuy":uQE3:nݚ.%JLI<šlGZS8RpM@%nkr,cNK}VrVB v*!WO2ޮ?|=v"΀%X#VRV+fWc+Rz[̵}k,Q1g[h15~DEH,[T;ߙAl|%BhgqF'gqF$\7rz2 cXP,vF#JS)pg|r ǎ;Ֆ?pg9agFƼtRkorQ0㙳6HHN˹ $;XmOZNLBGQe'D ˧p\.GD9c#ȅFh5ֈ*Bv"O: ![*H cv#_Gvh}ׄj{-_3J{ޕ&XPRj );*J9eҶ\R.udXAΎv#>hy0qk,~u5pP>*so7V ޼(3>Nn5@.y.(FѢ1HFR1}{l*(TED؅X -PtNPa$QHZ5Cta~jA4e)sM :j*_-0P558f})B9ۃJTPGrv\k_q9XѝV %>$+% *σ7qiDNcnӟ?og wy]6pfac33Ot6 ~QjѮk}vwݟ ٤HW7mo4r=N0E'kK6&"7t9bX5ׇLhqEa|lunbU)9NY堼~j*_<8g5d$DI|%('.C3"Bn=1*'K9SdBHGy0bC$`c|Xbr|ru_t5X3JQky#W'7#Rb9{qs*ٜ!b&*b7.$1)=F)oXƤVΕ˸22ʥcVA !Jbҡj3K9GxfF)9\z $hq:"ے$>=v^seCBj@/3(#9dk`֘Yx{ɭ9h`KpSWK>ZIFgFRMbO_+kXU Y_ Fւ.,9G0xR{`μ2Xg' R*n+(TڎyƐ.KkGZv<H#n: 5J:acFBXF±=]|R %j-l l?ܞҐy)D:cPد@'aFHF,OH.Ȧ?,>89~oYwBrT0g fOs/ߞ%_LJ}(N|{7˿\ſ-`wn,ж?#ozɗOQV(?]??{j23X38 =6NrܳoA2#AA|9%*t%z)tT 7q=,@&CoBJҤaISNWgkRG] 5ʸ7 )!ʸ)LMBx?:M8%q/k}?@]\j^&@W h+c(\Dݍѣu?J&):(#._]gE_hhPhfz/ܢ۟&\e?mK=lyج\wݱMg(׭n5CZAvVkMiזv`0{mɖhf׊ndRr'o8M:J`-td-[fU# VSb.8\ 0Fcz4݊f,SO b.li#УNN쎏ܘ gBʁ0Z`@aamyLJY/^^-kg;qs*3U@'!6 =K.fKϒJFҮdG̜3!pOdヌ6K[V*b*,z+ P _u,ݴU> HBB8c(PO%;ՎP|L~,M۶xfD \ Z~DnTȍ=HTha 2:rFР6%Z3%MrBZI 9ZrM`Q!lغBT*=%_yp؉ pγ29 %=A~A﫢:.>{[&]g@{ʪ 
k#@m;Q5,3\7WX3!P;Aꉠi-DKmބup9nQoS <<-΢B3V!L\/eM#x do5٨Hw@ds4F'zǴC5:W7z:OY#=Ǭ_L̜J,P2`cRZMa9_Wv A}Sq=C>bPh~vBD Hi;#[Evr2+ضn.٠~͵5>/j|^M& yb2"qxV'\hZ'hL5v:a7hF0[%=vB,ݩqB>BaIGYb̀M0C\Ph M4F&-.LUEC!b~@$hk))(@U=rP]4sRw2 s\5y \˥GCP% ;E sn`!JXp*坪㦏;UTmd.>TNv ovL/'i<"x^^aJokpGi$fgba 裥s~+O]'pR}n.t`SrV&Yh&,bIRHP)er4. OQzLՆ oJ~W*"3#^=.9ٿ].vN~!k.7iY='ͶPy2L7uВkK~tá~X`z!~'_Bt7@8c;I0rO`@Dfb&{@ !^j)YJ̖lG6.ۼĆy*cIBф L7HiH8ɐ-AAgA61a2lTAժnEu4V(d` ӽfD".[ F`P s. );י#5%;+%g60nST֒$͹\@ vf@giE*im8DI>ӁeHf0 X%Ztf,(e,GT+ o; gi).;sY{nJ3,a.oyc%6̢$Utn}u]Eэp}C.6h_,fݢ#ՠ>RvT.CN"Ǡz9"gA3ەz+L?| Rx`h)PT?biіLZU2G{8G.혨pfTgir&Nb@x̫19\ (حC+ sN)w 8WF~bUJT>ܞ^,.zx/Ͳ87Irfoi!̂>H2RPvե?[ gwoT&LJ{:[WOnNrEzi2Zq܄.f\Z7tA6\VP.;o/պE/xmwR뱠ZthYn=GgI'A`\dͳ?eXHkްwx`"㉨eY8v'p[&V;ރE Cݨdec5X-ec`CNU^ݏc J \'ib.* YIz=vAԨv➄AN"(}8ZQfv>3! mX O,J6Ox]]xș*ySND, eoEB]1h$#ZBZ|( gYdJGu,ЂؙjTY=rfh:& |IA\22湍>kbm Fg;@Ԅ cᆜw# *Zkws##\(@PbDuG%ɄZ{x yV%{>GY!"IY  KqV k>n5 1s(%$&^J{cCg`G=Y43 h~9*ы$k4lqE}_?_/~_wM*:&b땎Zʹ)Pix;X }as񞬬\jθR5x٬ ,Vu{<`,xe}JkdlJQ$A5Z΅"UE_BiƯ#DIˈ7bW!A0P>﹪Oѣ~2z>ɉ-~PF6%AFfit<"4#F c 3v<ȞnTkQۊSjz:0T Ѽ]Wˮu[TWB! ~uKPȚ'Bʽ{s@kՑ 9jO,Ē+A ͽ ƒ15KC A90A>j5ɲa0d99 ِq/a$6&Y`FY配·<$V+',ׄY#1^ !dz.HsK^wZs{ .*\t`-- 3+$W S10M6TJ ޥx6>vf%Z?oӪb}"|:J(<h{y {D[' PĮX_,|/]P=z66#җr74^oRWdvos-\ˢBR6=!% _!fgILFt?6!-O2~߆r߫_J)5Sx+\PVh $.b8[y%>_)yZ{ۊp֥I q(I{ut$bw:P?kJ^/Lۺ72V]cF0 *.^; Fh"H|&I_)mK3fHT{PQ8ٌ!Ma0H>`]|jf>DY|ǎtp=*a>MұQwŏGm*fbH # ON5EcNQw,v@*f!9U/{}*2\- Tr0,"lgi ժp0U2Ԗ+=䓋x\褈F%Ewn/w8-RNs)O-~oSh0*uU;+8edž߽b |mv_<{oA yv}FB\oHc# ;'>Rm!..5 )C\٨ leyKMiAXŨM?VGxШԡLg1^?ݿ};Mo.)ռ[y xhg:>OüFCT\yU{.$fjEVW%!Nvذ)pCX4T;Bf7gNo{7[.)6턱^kҷwO>Kn]X37,jta;׈s˻qB9x\ RL'm)wѷwOsOn]X37 JL0)vSLSC,m)OF_CnƦ.|5 XeyyMGEqKNr27Ǒ DG&QD](0ƙ_s8z<*r.*m194Р)w[JI<۠h:Ҏm۟k6,mܱ7-< Cڸ\T!8ƥ+FF[F.zqǛ|2mӃkqƉĺcx@ڱF;" @c%#}# PEb#y>o!(ʾ뎯\[bȏVR`G*e_L/,Nn8ZjR6+BnJXBNH'AJv; nG6H_gnY69וdۻY [.)6tTp-??+ѻua!DlJ8&Sz\ RL'm)wh|0 ݺgn6)b+S@zfWVVJ+ %Oᨛ0c f!pqqFKn&^FN˘FVl~I%Ǝc7[d -:Q& Cp3H~I]rYJ^Ѯf0;Rc 9Xzzr9+ 0z/u\ ,xV=;uy ƉKv킀)"R|q:6!zqQ#'iͥ1(~Y ׅT2O0\y}+ ơ1Ƶ6ĪџQAOp2̯PXhwAG܆X',++,oo. 
wׯ^Q2Ւ ^X\pףOԷgW8 47x]S1r꿼Ar]={V;rwBvH5 2ʰ* DiV/Jo;0\,[Q3CfC4Q_tJ8CSp MeZIJv5y頓 21$D +NK;98톢Ztj̾}H<?DW:zpsV˭t 0"˗8R`(p0Zw0BAlbE}0҅ )Oؓ0A7唌(nX4B(IaJ G_J1ݓQ`".P:-+P%=@0Zt7\?*|,\p}9on,2+mcmIB%ۍ&.P:r4JR60ifr`:X6/>_Ix9yo_ʕlϕ' 9@zWs8%NХu~9/Hd>X| _v.?3^+";,9}$,?:,I V%IN=,YaIJ[]Z={PC:^ԭ5協3P׷W3z 3v#k9i>fޫˇO W'y utv :?t?+8ds.;^qR;ӌYƝ/cLK)wW@ SQ&q e`Z+u(Q}ܒd.gpQ¥ʯg%+]NmttO/jؼ[9$-0]meXYL#񸪮:)az[vrtD LBj&Ywv=(o4 H:?A(Ga;ŶShӢ%7UЄI4@|`J0fJ{F{(uXFa,UE"L harTKJ5T(wUT&UuP^R.YEtx*fԾ}<r_GOwL/۲fk\Vϭ>=7ŤCA5T?ۻB0[w` .gO|ı h?ǿs1&bn./'Wv}} |?Rܯ5%bti狿<h(v|sh]+q`'"4<< d`0mAn&QLniK]PA[ Aa57XYr_:QPΐ65u(aU٥j_iݭw9)/'j˹ @TU2˙N a/Ab z=p|s{'/joiԪUe܄Go t}{qYq:lfG0#|dg3kk]m!c9걿l肦dSύ +t[%'1ml7sXZŸ.6wVnv?F󯸋yV7TӪd~gwQ}ېEPGS.N`|뺾TNVZΒ|͵ 3$$+_prmҭ\z]R %+XX|`A8f1\Pq]啭b󜩀>Nlv(54^tTY)cZCRݔB_I[)K n\}y-5a}xgr Cqs}T]W;%c>!B _)*—w'rDTՃZ0֫[q/6ZDVJk+ỹL1BvT(h_p㩠[Aü)Okr8v9 @%r]p*Ygzʍ_*5@&$գ,ۭ;[lVOӭG3AKmW*a᝿k9NPPSʟhjL4zTt 36;ʘ:tgj}ē0B'F'2^M8JJ*FX5<>D=#ls"@ sg; V{ƺI- |*uJ|*) ^XRi@0xxXsOFmpfe'3 :x !\ >;‡as1Z8N&L/'d~ty)/xy 7|ȭTӛ>;5?`y|OMph$Y31=_}ZAB@;bW_n_G`vM #O.n=nlAJv35{Ș2I'HFX&v,nnIpᵒ]| 0ZDJA=/(%fD䳇?`KQʇaw|=9ΑH`z8l95wYjoxI{ϭіdx<K<>M7(%N4.>H;pX<:ߗ8ɨi4O2 G̻s9s>z%)l.|S{jKՙ 8:r`:bȨ,@)S,2ؔs;j:ky=+j% bV<$CL9x&.9(!TM+GIfSI=mԲ<2)t^JvDji!0ۺ4b)# RqC@f띟_弎.' ԭͯʲ3Ke9,V˷y bc7gOF2ʮhC@oQ+\,?)wpf؇#xgNȚڃ詠=u>6fo[ڢ/dhoM~e0*1=Sv@[>wQ=#la'ùɇ:F%ms&# =#!<M~ŁRi4E3zPl0 Sۨ懅ak5`yҠ 3sxXz`ఴ_vC1GȜaṹd9 Yk[BuKPUqycYg3 _DN ,c).KfEx<"Ӫw<䒑,xjJ"1ܘAgW+4eۍJq.CR.²dsN xHc z_98]1P}7&KZFJ 1*3j ѹV^ &IyjvlkG]% $ ،bg1i+*ǔE?)i.K5pG^FUɐ1x_5dR0N<E E4r)@Ґ:Ufmb;$^SP6Q$6bgG9E&hu˝IFskqj["TZԇR )XQBv>T*{uQԭ.:&Ov/[*A4IcMd2&َSb0P$%th}[hjjI;zHY9%J Te&gE I$4E)\r Ԏ7GƐQ["3.h 23&N[1 "0 $2( gg%2dt('¹(2]|j wd3gXZUUQ-J 9l"? L@nvbd6d(J֛b:s*$֓ /Ӳj XlT2 %;+‡PK'}EHfz? 
;m2+OYg@'҇t8fMdv@n_\ ިirz^.|Rbr%k>߿vT#ђ>1RLiųV^~W7@_R2L.ie7SXj X^^i:Me:hB3KLHP=Z?ǚdۻyJuHe)N 677lo𞯻 !hS|n62z~j+^lmo8Gx5|\BlsCLJT|4JV }x!aBG>'!DY3cpRYV} "};B{x;SضSJ8\;ϋ.Q 35Fy)~+ި`:J8j,J9X1¶b`:Bf3Ձ'g-SgI0vr· ƠEY}1(" h֬W%TU(|ju;7;)[ g M% =56윑${dojIsv-ew,Dʄ7 -Vn'de`m59(d'fwյG/3ޙ\($%R'% \ ȅjvmU^= 5]µ8DB KNmcg-Kkd2$LX"] E1[5deK: szO]Yܝ-BfmvG- ޕ6u?e)Bt|'ReYF ++u6ΰOvA,zY^uqIFo}7N;b`b^jnԬ[q~}5U⺄Xx̰s5%ɠ1cjke| w{lQ儼7_L//?}b/}z1ӗ0fz1=Ƣd7LA~Q;7C`2Uy c(Ֆן8宿v^ui\q8 X8DtZrvn_rДqoʱZ gu5;F>z&[owi:|z̀Z};Hy$~-OnlW48id8_h߼+.п6 8}{TzHwow y?p)N1axKfSD,o† _x' d 8ܞ(OD6_Yn,N`Y8/-](O6Z/NYYo>,O?7jۦ$31/P| VC9;^Z7ڃ& %GhZ9[M$%,Pd.aw|=]=@nS=Y#{҇3.;w]c'l{ϽϒI9/i^Yr_CXc/vjMK~3+e>IMXLjNz:~6(zɛn'q"{KȑCV+k5zb8PЖsrٓ4E4&8BUvnޥS' {mLY5T#RUPhl*(Twynu>q>iެ_嵛bksUzraU2ZY?O_ ,eI3Jo `-~\oqU~ Q`eOJ9?~|uRO++.3Ƿ}/ҬBIb2W'gbc-S ;H`3ޥuQf%({cGygˎ ?uz{d 'o[XkƙʺyRvLRY7Vv[n>ʚ=Qbiа-c:=ey]ߓ6W\xU# |~ط}ۭ$Jk'O23vR'EՓTG|$ySM 'AV( RO=xpUn&M=];9,_]-NMҭϮ dgzӏԒZprE^YjLjp2:)|堝ք |+É- }j rx V*Ye.T(ο٬}~nD=~$>FZefwԔS`X;juz b9E,0 Istj$gKOUƮŖ'6ܵ}e`_m%pD&Tһ S:&S)6 c}减驕$x_ XAAJBY% c Y +PVL.pZu͆"Y{nNow]e&0ܡ =K`mlbJR.ZsߦX]ܞM 6O_ĢGMjRx=19Q]C˝d')S.hӺ tԠ7mE!kLwpH*VM]MBu(Ѡs$:BE/^BuwoZ丹bU؅9Kq 6xljK0T6o\zE f[ocz4Ja.mYL#}OK|҃=:33vV~d&)gb6l 1Kly(?7]p +-LyMKF-Nc\C\Ove}99A)FWo̓ۓ_4Kv͓7x.i 8r8j3>{#;UlS1\h@zӢx(M=PI&%R(QkձJ0=rtA B0Kp%Eh#^șd^-O)Z(W(OCy]Kdff^:c.f$ ͲH#ɔ1B=輵EkMcC>ԭ ] .Ĉ [55m<čp ި]cǖ u3c-zukq]BEsxڛ icœS-}i7ވcѩBNDND'un} 'NslUgJ!_Yt'?aeq:wegc/Ec+6?.Sao`!6U;߾8 IEDD45C]/lOA)2{O%[-"sСkfP@OKM8LЩŚh.Q_gg))|lDStIULOKtʧ9k`@h[ 6)_ 13HmF*އ_I/;!ѷi"Gc ?M"d]՝CpH]pԲ!h,R6j $PD^h2J[wDƪfOMB-1tjh%\9!ҘUk:4("ʋHI ׈%Atݳh+J1]ZPByw.Dmsm: v}[_Y!Sg:_V> (OazX5e+XsGq(2Fi`SD25%@-׾ 㴵궩Z>#_K^։sK'su[[5:ӫ:fqu#ж,PGmi!vbQ}(Rۃ]Z K-G\Ǿդ3}oiRQ[,#DI״ljbTAZ& hȞhk|jVJtMki\lv aa"3gPb&${8QjixF(F8шH7=1ȱY?k0AjrOCFlTΫ RK߰5$d5z4Pk9ڤCޙm:${Tfxncрlp[w~NbK?P+OwҺ!K?Ii@ , Sg8 ZoڻIBGE O$~bzD&)1~Vko/2>L"LH|_1 5j*{!+1][5@k\=R^Jo,4 aWNxbs6葈/<a;bc;"7rk:0&Sn W׭6(Ěu[{XXxǵ0щ*AC@V<)D;f(0D0f5ut|q @#qqs~wߍ~Gt쟫iQJOL68D#zAQJpLSX3,iki{SXFLF{Yk|,|pj5Ef!`)Yբ%*M!;xeD4k^qTw8dNV\Zae^JTo"C` 
2jԵSe?K4w7JSOSnctp:BL*A-zK^}ﭤ)oeZ<_e8n 3nߟ#Yf& 3Wl:KRؗ(ixu~rFWCI%M m c:rl+/ 8?OlďN1&q3nLa>c3?w҈OweY=$N*0˜5g,x܊[+V!ZdveX,~2%39z˫nǼzZ8WOz85P^=QoaĞT'?r']ᔷwY]~pXVDw̕i0j1OA m2x BoqO\VQǹ9'k7E9r|Atx?px{p(oϹ$a%' 4:b4 qZMCPy y!tm; &a"LKU, Af.њ'$L8clٮ Tmeh-keͲ&`8m]f$6uIHV*qTi6vuà} w 62ubXW.o#Nj@ba Ƽ 3>tº#{.6R\DFM&дL3ZQabFmCȈ{h3{-`>ؚKuuټ@?L*a/Ia_jh~p( K4q~Uշâj 7E4[ێrM NAh}ҕr5WS! +|7<÷D:rTWRiN42#W Kmxy͆""Nٿeh"HƭMW.LLllRf+GugȂ䤻+՝ndFGNڂHIz8Q.dz˹0ꖃvnF]`G;ZZ^37x 6 xu%&K^VmʓN'6] e(G`GsH |>S{TA+l^@Kg<~{3׺|f^[uM\& ӤÖ,m%qju%8NDyδi8U׸8zVǴT|WW8uQ`ʳ(tc 6?~|"¢V¢̊O{NĮXF\釋 Gw/ Oy3sggї;pKL!q5FN[cny5{5dX<mDbJ6k(Vmgd :myT؂YTGdZu]7nEq]6>tkك3,kpPA<58LlX 1~Zo`*kP=x_`!5Պ{}pSEBkql{쥈;Sw!q Ͷ:X/I@j&/5H] |aۻg>0wRB_wl<_8~yc߿ aesk(=JOn%?쉎 iцwټ+ (q֬Z*[v&eL  =1W1˒y`L̑(|y}G yBꔍ(V,1Tqh^l[M. gusA$dx|nL_%ݞw3L@Z ٻm$W|Y͗"Y 3l~ŀ'=n;/GvvP-QbBdFbq< t];c U\eW7@dsq P% ;-QM-^QM1o+O;}*܂6([R^ H}HZ@!T\?^qI#?7i*X8Ui6U++TKB}#9Mjbk*)150ShMxbJ8u -nnIM)V2ͭ跁pߦ\A$Jsa,HѻEU0.5k 1Im*]XjE`, $Z&%զEYzAD$,,\=$ն~Qcw$g M_89vRmykhOYdSֽSVZ42|h\T,o9"djmt/(fSyj9GJ G*P/_ő+18r2+XWfE=[F2*)8_l]G6 Ifܦ8?,G6CWsl72h3{x;R )\=7"=wěx[]|w`xCħNT;VM@sjyT9Y-R;kG>6OX?GB};im˹13a"ǀg` 19dҳ2h98-54)L%yÏeLcDPRQQQa >tǜ-H0&g}2Wl̖JJkאRZcͅh<f=0ڬ1ii88y:8ǻlvx?v͎~!)a0W d\vIR;fEގied $bfT6~TCoƏ@j#8c1 -&m "܁SDQ Ua+MUb) SjZP*3BKE8`JPR!#(VpVJU>Sr.' 
-O^V6O~Z}2'S2K,StǜN<>y-PؔHX'YwK1to:u+hfK}BjFsRFB@VzHm4 )p?<=9Mzuܟ^)lF/bR57SP97i~~XN yAqu^!|3gnzv~y*(D4?c`Rb&~QއjRR(%fo.rv]\?}~G/cAKd\zgnx'Q-;o{L+9*ou}NU_)ẉ5 3; oq+,0y> e9jh3?D,$`99d{Ύ+}_YX_ݎk=G۝JQCgޛ#D%]Q#ܼa'%,ui1xeOW$sHR\$y#*^lhx?z;(Wn;gWxF+eV6m1Orz{T\>G3{O2]\vq7]?Ql;Єv؏x]is8Aa= j{1)=}zO^Y`Ϧ4ו` u|3~ }rj>t-փ?; z^w8Q-00BNS`H|/wxe\z-Y8`b3* &kwJyVS^4OӁ$ʽޱ<$LJ"Q Iׅ[=YmDb|p}ooKJh|:jJ6`FRݠlaS%jG4!r!<붒jZVIu op&PIUۯhyT-%R*̲("esN.5;I4e[]T$ӓF&]wlU] p|,B(7Ueۙ??=jzt:tDv> 5!mtc`F̏/l:$նК"sc>Z%O}~ ,r5 fR_{5lֿon5È>~q_VSC7od7O7A$Eϻud>TRcLkٜl1FTfލZ-\}p6twG:a5\T/Ep&sӽ+oC`[}ڈ Iqss]d{7~Yp7,-dDSdlѺqɑu7ѩ7a"og]k#*jƓEn]+PZ'зfT;+ tOy:$+UNX=I8Lbb8Sj+4-ʣ4[+=N&$c1/S/.̝y` %}9 ΨXSX6 G!kK`xm_*//Q(|Cɩz,Mq'ung/*8J2n9!lA6 R"u"@Rr<ɾ)8R"ƙ#6m3>۠hJcPb'fdڕYjk:i|rS5.ɭܛm1CeIa5EG"ʂCrQ?ϩSpN1*ԅVXTKHmS]rDNg~\*TU ](QW(ΌUj[IZLurԊdn|LyM$ԁ! گ<٠l?!YwkKʀDzA)IphY)ApdB?>Em<ԍؠFR*[ $72!^H]j9'ϑsRD2 1?w[EIS~Qs ȃQ˲[JNбoܐߑ1%ڧ/ѹ<,n'~i?i"/MWү4U.ub%FgJSNTe(qLd2jBC V Xo\oI(Ӕ6[t;.!.Rsc 5ĆD%"(4S K[J-*s֎+)%5B"V^\kuRvŶ`(/7<8v%آt%Fh_=n KF=ۢ}%F'M!w.yz,MACr`Jkp>:wb)Qhbztm'jaT9-cOWOlӧ=e|FWTHXZEtu(ê~̍7ݬ(k=VIZ#B;9ђ+YQpeo.rv]\?}tp "@ mβQ[+Dۼgf*>넗]lЩPD=8f)Yχ/%˲lF钦#)9K }dU4I \_N0ja'rrǹ@Týsr7S"=:zwf8i:[oveP7‘~uR t@D6p(zt=]DvRj I½`)Gd_Q"ƺio[8tI :"x5ߘ,/aj{ Sב+BH Dy4nLBYk 㴭Q-R#;*\gRݍr4 (-x": 4"׿~Tx&Y˩ʂh63MeODӔ里ZA9sɋ+U? 4#*2C렇?jUXR,vZ*B'}j]UH`\X-vlM>As"B?*M?仢Fs61P;$XqN<T(o6#*O D}{FO<TF;sR>v+h9ijzt3:3='M%!*βV :4UӸ6Knk< Ң:%rVZFsy}]u?MH? (#kNH-4~6/6$VLp8B8K&!*NCkxb5W:JB eq<)αQkAz@3뀕\I.P%b;p'— B[-oF{ ɼ%ךIUh1@Ff FQE*1ړ!8Է2sQdPTF0&W楝b3 ,%\`-SVb0:ߗ!s!JnxXX%J[emH: PN?P \14%ڟ'e:fY aW߮arRt Q~u{e)}^yuY2^`}pm!繞\ݘ˰n0KasYM*Z(9l'Je;G%O(~Q@lOAx#U]z2)` P"V_  rlԛwbXc#=]c+9.Uռ1zl8V#5{ 5~e嶀izY9=5&#J@?jAC*廃څf~Z2e K>3>v7'FaLz{p{vsL )gJ\$^a9nmxNjGժA3Nj(clR[jIJ{.qm*5Wnc:MVB"Gj /s7sp,2EV'qTɞ|%qm<ː33smLug_拕LǛ?΁DY[Tuumq4SȈZf.=;hw~Y }оV@88:M%>k[Q_zcGGkubߡ('1F4%$3PR% \i^y˹}ځ0N F pvR&-J1ƅN;P4Mp:uLܾJ{*=F\zdE h[]rEJWKӗ8I#K6sTI꒺ FA()@qpyL9R#r.9G?b5.X":e6l3*%Ao:4?JRrktGwߣj[s/Dƭp^M!??ؿl TqmM\dzҥ'ym3k3k3k7o$"r$S]! 
K\`NB1!3RˢB70|VBS=xiQ t($6pE1hOp2'oQN~lO|oTy n]@5mS@eqj)eV, iFrQ\Ѵ c:M3FB s);BpT N7]MP<%x8%Dn|C*^ 7թ ^c!sKl}r1o*SmE%5i,ٞ ( ÝX0n"{zp)SG>l3|ۙELUW?Qf,hUix4Ǭ5`κ^,nOy}jW ft;+B/ss|_԰ lvN.X]:clMnˋG~e7hr%L8jk,cٲ}?Lui* _ ֳ^TYFDl'{э#cn2p11|V nQ[ M4Ħ:m~tt[ \L't v%i 1ŰD )̘yZ|/vl=-̋zanZIӐu2_.C7D-o&?G?ll7X|2a+3}5,Eq /%hDm{&6uYMƢ?Ԇnm3b ٧tuw{u%]=fk#f !$ssd8}?:}^=&rqEB7EPR,~Z;P'K-cvdֿ K@wC}@U) QJ҅jQݚV)XJw&B*ݐ~u].MM"dPxC|b7iV1!BnQe˴ '2y_sW7~[86Ûa4+ƶKdi}bMr"ٞ=*]o\J欗@Q$%XDC:ۺYJClZ?8>]kR/Lb$*UI*xL`HvҿuFep8f4Tϓ/oW!R$ 1c73H*Q92Zd$"92ZlpUwS>B(In 8y΋߽amO?mN<2;פEfC ^OmIm$y%#q2s ~7 5w sC$$6u\xI⩋37%S3Ə`'ؑ1IgR}a;8N|$ƋMP M9+FK]G"fZH9+aqZ`fmPS+8 *({.A|۶$YQbDZeyA"% Q(isI afFSre"-?K *v$L#àۑqgOa$iC#o;W܇?q2lO!; T 4N5T_E*d"v8 B yȸ / .;rt֎ÿ6'#  {pP?=u(gl!zAg.M n{ۍ~m]C?$QekcW̗~,wACŴ] Ym5BB.g}]nKT(y-`#+d@, s0J%\cP^ID0ʣP#kp m&%m;q>(r'O,Jä¥DIԥVLimDbQ9DZޒ%mK3Č8F[g/?1"O>rxfT^(UTc$Hj@@ .S#E-t5&Xڊ9Nʄ8*8STSz$!el}S)q{`\YJ_$^<0 |n^ץxn(JS}B9!(9 a00ň;ГP߱S4 JO3t!MhwG3)2nQ&J3_٧ϪYP.0x!G5Qwjjb3z`~&b EvhR cEv‘^11]x6g) Ɵ[&(f}:^Wxfngb<\_O5+>ow7cHQUݝj{`α괅 r ?abOCO&f`W y6S57>vp3k^8>}K[B |o%[8e3eM0.zh1Yڙ#<7w >,1/#aps"p8IEI贜\Ȑ߮ c{PnݺF\]ܐНjŠEu!ag~ A/PI7 ߝ:neo/<2-BiVwKea_Fd>큸U#&/-;`.Pe#wP%MשUMpF'UJNQKTjOS(.H#Խ)po7//#E\u  _HU30؛PBb fQhVb6NAX(v^JzF iJNԈL-Ն3\ ΝS1NMo"=+IK'xR0@zL#ڧFdӳekb4x/2Z 5(^[)k6쭲A7 rQDtvE_ #!LMF$`O2ASw(/,KBӰq{e(erMiS̀u卣" WˠYn$t\!0YJ筕L0nP@у!zb =N>,Tié4š* @cɱ4z,cdTȔ`lj%^[ZUCwr]N F_31RL{n㱗 1ebcbR/售-\(ȧ4]{YXqg Ⱥ&p||W q8q׈f!=f*|@+/v}R!ji' lȇvD򡤇O5?GUb?qۨFOY&8PN6_0(&1g?( ({LD" "8=U9Ѕ_H.z-Q-Ȯҹf+a7/Y S}9D9OMGmWP.)MSiV[%| ='l;e*Mnq$D,[Rt$TU ʨ}trFPG_XfOS ?n>ř;Άti·.s%Ѣ8c( {gHuAga@x68:̓WjMBҶQ&[zAHxEQOde!seZZ&P%²*zZJLsATTg`CTJT[$AʕUAq4%#N?UCtWU`}1.hw5|Z \EŃCEnZ)$RAl 2sVdy{ӭ &D Ĵ}3@:tB3 Uw7c(n {X;կ}scž课c̋|U83sU'am}]t)uoW+J$'hs{80\'pX <3 ~x Sz]p`FjEŹaeDl$a4f)#ݾQ;ap^tJ7aKƟ>Dbņ& LgbN|DwLq'EI+O=3DSU{b-4)e '(FBvk_ImABuNb"8a^iu W%w2Dh,x 1 CW%G@XkCD%HtUT*4Ug8%bx٨=davs!O 8"ƻa0/!PZMN\ԸɉKz85#,R'ȍ:Xr@N>K4sXjTT$vE1_RQWwk!pXXU $\N9X  .4a1+ΗiHy2 &z4 4=5 T.|CϟEOֲr&Ia>W}3w]l EC^t@Tu8TBL}}qyp7xu 
S_9.6$LI[xEUR{ܿSh5\WlC.rGzWDcx_X"&򁣩/kuN\kt܁glI#ڤ n~AQuaMx_1ܬeN]6ca?-BW?"QoFUmzTkǥ4C/21S@28JYmBx7y;&a.T׆^Ta7ZdW1կ[ պ͛կ~xut`qFfDР])dFGX{*.Q1j_yԝޗdAsؗ=AblCN\B|Kb><|z$&)`D|׷ƣ&sfRiغit}IabX4ܛ*xdSAr*uh#얃ᴗIЄ󉛷[ vZ|Ysvg?pw4v5L)`:56%G\(Ղ312 <Ĥ@"e.K׈#∋L.KXf .7caS]*R8o{Of]7GROu.]5"AG*41@{u}4}= :\'wJdJ\(1D9Ӗs)o8Ɍ%3ŵD:ڱ|QOF4@h=H<-@6#ZS 2qg#N<)w>?wx3442jl=a'/~eu}]`VL]SMkV 5Zp W y?L[61Gh-x&?])~xluŁ39at*/:0E@54(({a MYmGSiayhQ"{{ם!#t<[ݚwF>5n[=)Ԩ|篽"N RJT sܓ.o*<tԺv4a7\KbfLmo_-]M7-a?_+Hд_^? ziM ~_^rQ-W׺O;ֺX6yMkO>C{z7UC!ZOh'R:Vٍ1JavAtm#gvQ?h[wj٭ MM z;M2٭+)ӵmDZl=u'KD[91-|.1 '2yba<ݸVgAherѨ_+ VNڅ qF% R:1T'>7C6BI030 IQUs%-e[`0L7^XHV6,'43:زBプԻ*WƽbUCQ7C^8 Nj"'(Ok ^0W8#@Rn?F/\:u\zVoRh!/*{rqeL 냅԰Ox}(sJG9̆,)lw~r+rk96 f{-hi[zH'ڸbU8ͮYdݙl2̻Oeޝ\^Х <"h#9'~zt>8(ʣ|L\WA(GEw7c=qocʨ`D̢IkQW3"8U@TT ,GK8:UBjA> OIՠԥ{ץsMjgr$$, vd-8$ kŠM_>/R]l'{wm|-?VaZOvh8:dDZHǎ?l\޾a}s.QIBU*ϤJ8먒pD%N^GSQ|sF'>G YKlդM'ӃZ2selź)Q®{GT)gi)r%o/M̈́lDԹ7W=[HcOg3Aȁ[vq3Q6c#Vy zP.ؖAO[lٻ涍dWP|9' rUaWVR]v}S0P$gHq!er,3=|EԇY`hhfG>GvɎT5վ.Dgd|o^#)uhA{&4wIùGJ7+Pz5?|(j91zhz_^7n6/6MGC7ID'h#0Ӗ"W{bO rW[P#vn0"t_u2@u23͏H]{RF|xbcACI_("\B!pKw\HjG{A eawxֿ+ӕoUpwVo1$;֫e ;5UGx@1{^ y՟ є;4%O{Np>1ƋSP_ T@Y/,bvO(v > akjtnf!UZ{ t},] d>p,ta%kv>$"x5XKNw"@"Z}ԫE ahWGI Xo7dH d} un$hBW*6t1xaYki>\$tɦw6,`ԴG[Dˍ,^8u^5u$ĸک ,vN}Lk]O#cTa(i2+bA(g1D1H TGS+sO(eZٞ: 덭G-:Ȫ$h 9l1Oy Vl3]z|"{UO~_kz"-yL(`Bww^ mON"oEƽ@αx:9#FcoVr:5: (DF[}H(|o_3w;)~c'1 37P\vj5Rt懋X`׉$e\ 4ݨ+ոUe5&8Y= 6^R I.AӅ`6hT0 $=͒2ܶۉ "{=Fad뭋B=6aÌ)bR:o6RQ[3 y;],U:Vbl4nTb"vgWgr3NT=q4EhV뺾4%&|  J~s?-#lHfd7]d ZjU{d~MJMz 5}A #?"1Q$c hUb^G#7 ;:8|0GG%&CRGӑꨞ( a6"4oUߠuӭM4 (`T#P HJ쁼XqTS-t/SeZɰ|/fie,1/6llr/&iې=QH`9gjL !9ĸ;]_ ̯2tm}܁qC=,EL #P ?3~Hʺ+` aŠ2"@2/c c/Knn5Eo}O&֒0 #d]g7¶*cbtgp?FKluVB>-dsdla]:i0Nņyal``(bc<>tr ;JXIهA>4C*Q[y?^DZ ,=zrg ([qr [Aax#ol*_/VB{#CaN-U2c *',8:4FR0RcΘ t`2 BPT>ކP-X)յ}FRB@z6nR Ш$Y:8B&g-Iq] EzjLJt{)}RjsR,-ޘ}T_oSMf>k)%MJ 2*hNY[! 
OhYN]ݺh~f^o2S $KNjYbWӅ=GyBfCv p0^,Yp5WToEİyHaeouhmK˩ &ڪ,6v":L["E ##; uATq9ƭ] <nF#s+ѝ#W=GK*@%oYG3==Q?Jg{Z}WeΒz{ x}ճ=C'}uϑwvZr[tv<l5HfkZNF#jeo 4OC!@,ܖ;?DwiP!ѕL;B|຋AECi"b+~4%?P/~Jy%P订COb~sCw *]𬜖='GLX ivn"B0 c:9DDm.Y`\bK-j*Jyͯ.Z-r˓V$RgVmrxyNOZeijs?mVCE39VA39]1ÜFq=#H ߁8 lv|W8̂'{̒,ZKOfljQ7ݚ5kD6A }Mڈ{Z%:rְDګlnS4$17N I/d8! v6@rL"{U~{w;`xx8Y>oQQ %^׻Q ͖7Qܞ~PB>.n0"oV0r:5>*a@RmhH86VCg_]/LֶD{JYLyt=3 d}Oa]avRYW\BY?5x'j1Ae|Z$)F_HEVHEj%-Ufq$QXl56LB]'C85VA{%Yj&i-b1M9-i:>B7 1(?]OI$P"_c}%F A1C7K# ̌pq j9cs6x%y!rz6H0"l~fK!qI '!A,* EP1FspA%@1dތ G>W6T|f\A&tDc 9R aedܥP1ΰc(@uȰHS% ._R $'bwp)yBJR_*r](0P kat(;-1DŽ=~B32R$ƿQar_ZG]/Z h3KŗS}jY*Ǘe| ;J)h*^߰lqK_vn@Fi.;Ʀ>! :kzor`ݵD=-Q¼n(kKԜFZVBTPƨ^R*4ud9ƎӸ\Sm*l6 a|M] 럮k%Ga SPIƑ_̑`6$ 3MDsYr5jj$~9f6]Ń^4.E  uF=ؗK_,1h6r0Wh2F#˃F[7#SB/]B߯lqi/ Wl~ʛm-v:F4,IwNeaTH׍13T⷇xh{ xDlzo4j{ənx<+`~i;؍vSlra ʲlu9bW[⨐aa晖7r)wi28m1.胃18"a.gRJQ}M5Cʽ>G)%[%{ٻq%W~sN, ȞI A;nۑ\&ߢ$۲vSKyH2%*Ū1tշT@RipBZ%)%@Ѕ T!M Dbdy`e($51BhjNa$2)gucKWk礔p|Ru"g-I)R*syL)%OJ3Bk2?)RIH)e~RQM-<}Lvj.=o)OJvZ Х}@)=g)Uy&*OjDH(Xu?&ӿV7wS{Z=bc/.J%~;TTO=BzФՊ?m2њFF}ESxՈ ۆ r?rvD(0RXh2nyǣɵk\[[QDJNh?,d&)ktBcK6&I01"s#^8rx"e.&`A 0QEQZ[ba]+ߧ<\ZaIBc+¬8ve#ikhx<$6( &RJPӀ8Ƅh-a*RpT~W4U% aKOCp 0.Tc07a0'AuÌ:X$ VI8f-j tFYv{Ws℧S( K$]!ݶW-$ n#Xxa1N YE$]fD 4舘sxA'lLNIqƾJkъU%k:nǷ/ ' Q!>HZݳ;^/X+^-CwKb[<= Bl~sNrIk桩'Ņ#w\ G:$ i1zkeۦyS[\<;qV`0(o5yf2W(SSYb[9gz2?FMe699f3#ωh_fc6:<'w8y=g.br!!iZ;TV*K8&JƆ@()xA]UmpN&^Ҷ:;Uh;[Ë;3P""Q Q̕L+tbcġQ.œFpSbUԂ!ŃJEMG]s -!8+ك)<T3/λ8 *J S}RWQ|)-xRʈFtwS}xsR9sㄲmU vo1sbψc& Ue T2갂1)j :n`g;+f+oWTG<:ղd0.up\N[ pI{(vh4K)XŤW7Wń&ʢPw >:v-*${];h:t+d`G66TK|EuHD˃B>BJ)ҌmNWk k5zz6q 7Cٍd}޲^^uVcm1d?'C;CJ b? 
M74mRI-yRDt`='fud#?K7$~x9L:^6)& Ax:p&#{ubK,3XÑ8d8A;1,<>g'c['l ADŽ4]_>#.85]^F،[$%3@rɐ:y3s4-bSf+X[.tjqKP_M}`\p޼)-ϲ=Q9Of4]򷳂phzD|Ж!J͘_EiDӶlFA hAlZ/f5/||հA`Vqy=Jܖ^x?}7z&VKȓIP#u,H2QHJ*8 UƱkU)(\ZB&2AFF!|5ˀäʆ<0L:RD06$SCS ץJTch#F Ҳ(eGغClb) \1j€hfQ+BVhh a 5B<"FL% A~$$Fx ZLKFЃH*L@Q #A$#!D30bQN%pxjXtiLEs2<1ǵ~ HnoשG]?)%YVSJ Ҍj&EJ\J'i"lME/g`s&BydΓ33 cJd)㡧9n-ҲvW\}/` ⁍nJfzSȥ4a?%qWX%Q-grc"]*oƕ2:٘(XRBfɠ7x 7=ɻ40Aҿ~9NNwG $ץi0$%_o4lܻίgo׹3$_˟3,KY* c3Luix𽛏׳ h']'+ xv|d9_w9o~eăRاA~r@ ݀ bBi+Ǡ-&,J{_򵣐_/4.KRS"JjZηv3ZOh&&zOt.]ʂ⇿4HwATaD ;FY"shB sboެd<ф v.2=Udz p{c({A3s#]\>;x89Rmj]Ui2agfbp0,CקQ6v]LA3Q#%;6=Mg)_xj< wgi /_O7ݦ=d_l2`f(}0Ɵ2QeWS9APyr[짼s~N7|:<J$ZZM65Bɬ-_Ywg.#*yY-4\}+'Hꁕ?9ĺ% qmշfuRIzϹe3۷-ݳ*\xTHF-Gzҝm INH`ȩRV96U=%Һ~Mע )Є&{t.0NW$QJ0&" uPGJ8^ϜiIeHuR -pԅ1wcR KcZ*2Cɚ{`:{Ϟ9R̼ fq!A Ш An< *`&z}遪\)IdhXx u-\ƀ'7(BLG@X Q3xea8$LPEۈU\>攇4XMGCB3OPP}KK;W_ܮErW/ .;,#_?1 ) +[]nOfWgtgQfp\'WEW ͳN236!| L2( ƉnXe*\z밿|>qn5h!sb}IZՠ ,4еo ߽~:RݼMy$>V/(f>w:MWc#TfZO@5Jk>2x09 Qi 2 Hwi%.6$e{Iه YEaNt/VWo|nFz.|y4CNT<3"6RJqD-PskS5 Jh3J2/osʴ`J0d]T]Cx|{\К! #t{:?)q ӞАxmTkhdX5V|>xV?]ExNUmJ5I .Ek^7$E3QL٘bT/>eHB.jAㅡh^]Xne̘CW2$NKbw}s@ŋ#ӎ Y:Q!$$T\̝KRʵ&5{L!tgąB!(5y=jBn G&*M`8, Frðv& [ $ڛV9\6^߲*K # CF͝55:u\ =R4 HKE{^OܚSgraQt{a/ju w?ń Ө7$amDZ6 c!B&8`&jxA1QQ^mDy%N扝FO;8 ƬfB w+MnCX 7=D%lRYMw1 _8FHfbz~j{{B yr$^ty)Q^H>$M,KQ(?gIwAsDHsoaiK֒AYz=D> ,%dxGT"bTr`.[)Rgj fWC`JL͡!:3lרA=k^zºobdy%^Ęǿϛ͆:ʬEvSzs}u\mJoޅd1Iwf}ӵV7]6H (dSD=c=y1jcY|P5ejڕ\bZľ7>fT6sG(vAFi1ԫ(dzlW )R/2A@#(0p$ [/bI.FIhIYxP3=Pcw/Z- `";1 9C?Pm<0Z!p節 Xђgm5x ֩L 4Mi};D"ESaM[KG 1Yk%=zoh# 4*zM:Yr E"8bCaLIj:$:4.Kl铤#DOW)$\NejiHfW/uLyo!=b5b&J/CF *r8SZ¼ 9O`CTRbS{GYZ.jPX1BFX-4LX މE'Vj6iEsR!3R=5JSb0( {T i8Wc']{ 8*nv5SƛڐTk`:3=@1\Ctb(R-Z9ވId x͑IjhoZےZZZ-oPLa〉=ZO{H=VXƱ}s182 >+ҕ,81o룳:1 5[_5"ɀlݶpI%(*iL4ǺPEF`,h;ŏ=YLmv3lP0n8b QeZ7P&D4(za@Z7t}֐A86y,F_O≹'fdى1|F N 'u*q$ě@&Nr&8sSqǾhp.sNN' PC u:1z&dgDo CgBhz6(6R }[q :6(yK5G\K6ej gR#݃οRk\DˬR R8Kva>u:aۅ'MĦ ,>nRwb[))S6_tj:tc{ڳD6p-)I8BKnN7rۜYB.&nwMM (m!7`0ё~Bԟzyb]|f ۟ڋiuoǟ:dHwdɯrEq.}gW_n?}?ŒoW ¹VFtFl.٭|h?yw{?Ak~mvU_5|W2{x7E߭uYitF|QRͽiRkxz׬b{kB ш1zG߼{vfL]]P! 
&(M9qoo=$}aarv*|]RҎOJ:wRÝl2UBz(YrURϓApwsCRLxz{ 3'T Y4q-wZ FfY+TNXI:)~}b3 jSwYSrNa?ui=d_?%)7B^u)DTW^T6??%˜ Rڥ9`i^,h/]PJD"4j$=BF sU#Ŭ)[W׍EtQ( IS$húE)hbSW5 LĂtZeI=V_% RO*5N5'Ӿ'ZЦvAX+WV] <1Q5fcPH[DuJ7"R6֖x25w FV8^ca1#l,FsߌiHVx]of0md ї!K5BuP^>[j4%/5ăRԍ21 E({RDuE2l`&!&7estI=>/NCW&D&Lw^eDV\2Ww|zxN) *RHcEs`|]F~YM.uv9ZyP!!Pa! "3VJ"n=*`V -jBh1߉ 츛RU]8?2&nJPJt_3BiBrb)wu2fL_)xC6r0-$$ے)w@%vڶH q052]Eϵ@+[q)j- 5ftOW B zBSj:zl_QevbL>lew;g_zB:ܔlw{b}"%)Bt8–XD hd*"xmIZW1FD' I79X,wsm]Ʃ~|J&IV`(rK^^,CGm$+gnsv_##- /lj%ԙ)(` )ufd'c[#!lTnBf=6^'[~y}ƺ1³'??MRs9]-}r)?Wk8o쇗`oӻ$v⧃0._3o>h׍TQg|ŕ'nX됎C*LfEe JhØ՚w<&VDVReV6D$:R' ɷIě~ 8 e2o-W>gyޜEoVq;""oY#M<=1Nӏ !&=J??e,Y>/9_O)G&…rZ։X&;\:j!ُ.aNWЌmXt}ћd4 # "ޅeU=K O|u+9 3ұ5k&FsŭQ&|1 -yt%dD`vIVu! L $Csw0?5OVR&@dBe @K9%n<[ONV%Ie i+S] th$ uW.Fphz'Lq:pňm&ו<vbXHDh l1sUOrD+xD,.==/*;WEG"˯{ N7K4&( ~ 7G'"vf_]/Jr nrti9*Li!֫g3_n?<|y@HKez, IXzt`I bAYE/D*%N׮N: ΫJOJVXYzBF3a Cs!jU`E*KAҍ ҞWdk A|69${j*,Rrêo:Ts7>CD.qWS?^:*,6jGGq4p0AH>t|>0bD-EC­$G'r,#4K11~I7,fa:F(6^k7@PSg?dqҥJ2yǬrA*D.}RT-4=b3Z XB_~}y"k9ɉ펓iZgoiW6Aetm# lm5Tش0yo' 9pFk%o'4.!%2plNȹdr: g>Je,ٗY/ے5t:9̲k"@pA )X kCHřC_4(3w8Z ⋡(t eT$r#G+b ꄣӹ}gL 8IFGxaD:Wu%|=ⵘ1˲% +&Bs}7 IǻyC@ZO/Sk+=Hb =3Hr\o;u <֟=v~7>aRj7΁һKF/ȳw`܍zzyy-9_m'G%@?cWSYhӏ8S=J =`JֈaT2q_;avk c79td;g޿OAga~>sw@"m3 |iFS`n\Zy++ V%nS(Yae)PSSMj҆72HCT]"獳A'r6(9m4.@- ]r:g*+u_0YX }Y=K_ACoڳ}U(ժ\otd7aMw$DO+(#fWI:bvV3}8%cs_OV6 +dUNSːT٫5x9ޜ-Gw4|%exԋGȌm7<oTrzQ08Q:F5VqǨ܄u@8ګ:T"hF׵jу, S8ΰ"j,:%k Y戤DS$ZS B"*ٽ^$˿J8Ei11rDĉ9@k.*D&L0=ecw]?5kzjK"̞F܈KS[f`҉\)Ri;,]ᝈCh$glͥMk66b߃\K?T"i!ֻ'X?3ӢIL[7('~mC39VU yfBo_2~- '/pj{rj::j窡NrPVW jWK>\ zsI[K[)tQ-cY1A#FUSV6-ܦ݈,$qڦڦyaV{T aAoɷOܘK (qԀxLNIqAU~վ̀)oyL!Pӫ.!Ⴧ"Oq2;VSxvpui8sj=;i?WX{{4,j8%FBy0X8fV2 ^𤥷Re};}#X6.f0tvÇy0;fOM(I~:oO>m޿'[_oݰG_Dٷ9Kv_(,Q"|·o({0If/25Jl.pC~ul+ Xы n"J~ fOw©F2Y`IWJX)1AXgg28I3az-]3uK;x`s,R$P >tP :IK)Pe)խ }J&/ Y&#:)2ԨIJ[JQI)|*!XEۙ~ңR]ӳڧtyn]u߿p.vC+U:Bkҕ:\Z=WμKȣ@0 ,ۮ@#Hm'dʢQ pc _uxau k+Q4'[yNj5xsjO #Hh0jW5'2bNʊbȠfGt~_ۼUS򏳿woO9C뤀_맯!ڏn.%0y!|22ЅsÜ\G2(RB1Y_E)TG%A:Jv]+3ӂ+J4- nׅš\8)t^X#"L3!2͍"X<\tlbrb]GWe(-m 
%Q!iYC(YXt@Es}Eiw=vɥ|wX\;{z^[v4c.1J}l'&O^ҥ>q2Jp㌮t"lb7VsûNe!*r•dF>p<;Ifi YuBTjk#cKVXYzU35HFB6BJ7A} 5qORzR9aXZI)5JAH)a6P/丝(P)'9IȞ7\+R1υj9Tq9N&#G`uFH*hBS4%!_x3P` c[ 2)TIuR LJ= ,R1[}R*LJ0$(HeR4Ay+vXōM:7:Oݢs3ːZc6]ݫbPup@rcG@(%`I3ʧzRI~·C(/E͇ߪjMT P "I"idI,EmEcDnWnSs*ѧ[*h;Q4%WW~Z"5VqNNgX7NpvTs:ʹ|[EYe]nu:q@I_bԶ Ow@)7H88C0rHbv98 ) 9XrjY;jeA<-L-Bt>7V7.tYJl/k1p/eQ&$;B> :(QzΝRuj4IKUE!lO4<.! =X$=G)RH;N #R iEI?[ i&M>p[t"9b0)N:*L 6 $+-X9nk^KôtPFƤf3cxPPr~ ,lη.s3[`ޛ߹[j^183uzQ5}!:o$i@p\sW,ooW7Ȋm>1),6W9;LX4׸|PyFy&4MxafO.8Z3.9.y)|[ʆ uEPN>Ggia[ծI4? 'I~sUϧ:7;;O{]5ū3~5zXYI%elG0u\t6z͜Y趟>Nu l|M%;lz>_#lHP`/O qmj4TT%]#C]yKM[GSGP$(|Q?[|9oP*#5 {lx!|b@]yx+S"E}xF4:r@0&HKP;A`Zך5SN;2 ^L(#$c!h5ɴ9Z4Fk/HԞjACZ[%]ռD1"ŭv%[h(-zHsYD!zsXZ@ )ttG,RԺFfjl[2؟JB?nXܭq[c>XpictkQRȌޭqCn1$9.DDjb7DB0|R!*S!ukMpۘYuBe=93jn:*5XCٻ޸$W,̌|S,0XbM$ִmn,oU%Q*j=MȌGƗ@U9CH"|yZ# 51oJye3 z3GePj;m z, -@uXיJG:u"Koƣy<(W+J n.}>'| *#O5gS J9Em0LbM*5 Jrsmڏq.^utn>lu73=O]OHM޿{7u&z-`2~G5+e2Ԧ2DꚖ0tvD`;]5[HB^G/@ fsן9v&1K6/g#I&+8w=fђ 5 E1tF15J}]Bg9} wۗpYwp?Re8mDԾeY\y\y6FZ9Tu#wnӳi2FF(9"amt2sgAy Џď g#9caq~`4hKñNZ->Ja{].] #i"X`ɀb K0s9/5Lp0LMk[ՖLʃ%ҵIyb.Q,Xǭ<+m&sdt+eIJeTVCXjoY83%m/u&ezMFyԤ Y!{Bd(9K[[yhOzזp75͂h<脹ZE[)~. Z*ͳ҆pZe[) 8y. ]!lu?Zs <W]xߪ ?VHLḧ́J…2N{wؑ|`3f]j;##7X1ē2uopySp@a y ֆДL8;wV݅q෇xH)<<9A/'mSJ5)I;phH?{*p}F^>h(J7?gbz#{1HTSǹbN 3\C: oK=╘171AM(>5hĚ&X-R!ըf`@"77jXkb6FĹ+Nu!<<8mKkZPR= {xٵxQ7EiԀZ/1,~si+#fF'#B/U~~L*?ĉ@J g$d✚۷2T lxv \OӨ{Q܄qD56dkܢYQ[wC l=W&)ySm>lH?䣦h6?6ًrß׏nNV1lCe盇.7//mWa/h@xUm()O{tMN2xIߣK*w9,䍛 $ /݆ʀc:Cf_q>6ư7nClʴz|$f-݆ʀc:C"N݆3MMjJd5p|IFHJH @4F/ݨħWMiۖjMy+7bCz0Qv@y zDR}Yw &{}KvnoH*KˑCyڿS m1Fi|yeړ~M*jRjyi\R5mFgJVRN bVI;2:: % 0Oy`n(lXgM?4[߲/]^2%Ǥ7tYpRdGځC߄q'fIqWuՇdB !(Jr8wa2` B/?ldYbUJ$ TH:qh>'VC7s'L킔E+pWQXXH4͔nz&vN2!Je⛍d,SaqJ&O*96WM.rˠDƉ–U>LZ2/MU^*4R9sisҶqT,KY:5yT%d76/M^Im%QAWysnלt4k8BĔp7S6|VQwsmoEЌ }ms"/anҨ1[b׬%&p(Pa%D}ȊvcgmSd{t<ַYEQ輴է3e)Nof, ؄UKf$X3 ֩KbQ=jO<".(+laY9VI KEEpbʈ`]Ϝ"_XQc7(\#̓%g*Hƀ?xɆǼ 1H= 'ݔٓJ~Gsl yцV-;By42>)nIv(T[;" q}gmgc#rz֧Oŧ@{W 噠! G{m k A[Yg΢?o[AY"pXj'u "j8GjNL\Xت. 
"\ٛ\Zxߣ4ZޮB<Lʠt]=RF + șcE)(R LT$ic\3DVU6ˡBLJUyXeUyh`^"a3WtJÒҎ3FU#QušBd>' g5ɪr%vDT}2`a[eЦ:b-VRd ј\P%K4)S%D &G*3 PgS.*ZxZQV̦X&H)34u~hQ:σ\ڔ)uT:2b-VebnB_"TDLC=^jκKSp(D$R M.RVSSd-ziFc6`t*p.z6zF{tf(u(Acq,X#vD9tf`2Fl]3`UXA4@n CǭK a>78v ޒq~._:+BY9JasRV% HSt`?sƴq|Bn|Fj&pE^?\0DVl7ʬQj +3@+48X-)5*f= ٵxOhTY_8x>-N28Y P 'ݎ⮼F hI0Yυ P}-zW#Щs|rpBifzsE"Gz7y[K79oH|3T՗{ޥ6٪n*2i/û΍!T@ԅ4ᝢ_#WO%a[Oj ܇[᥯9]?ȹf*w`$3>EK˼+k| ٓd|؍ZZygnح$7w1T+(g9aor-X :U#JJB0'֋*=U6ơ[ XBA|(Tf Ӣb"-^WAkjtmb;}3u"O=oh201IN=bd]`8Ժ|XR KRWŐ!n :v[?O7SVrs ;z}mW1c;"S2ɔ@)`CO뉜%ǂfXd$U!r,K4J]`[$C^ BQR8Y6J{Ӓb1JJb p+ra,zȂH(l PQl%H[,ijUH^Q:W* p\r>V (;Jk |"w.Ћh9 C 4X:瞽Ϧ 1OVN*πr .>l΃4=RJH.L7" :FEëս@mY<6qz'[&윈 QR̓!rZo~(ezɰC UU*;R R#w[jwN.ΧiY*:pjq3;zQf)sTČ$$LYWՍ6n"ݼip"\U}ȩZزx=hm'.\8#Z0.+ z$wf'xazװBݜVjόȢ*^,VH݀Ȉ˨mfS:`RAa޹|b,.i56l(, Jk˔9e}s)R)ء]I$Y՛b$*i(&2f9 j'3]B\څ蒥`YK$DDFAjQVQN, ϶{#~x7v!}qp1ҕ4+KX؍^YEr&F1k4*--&3!Cқ~Id}ȨUD.+ ) ĵÔvS6G( z;`0f;F)gJm'h)TTXj #O؄Y(坢E}()qUiߪƧvY|c#ӿj(ujww]}eVJ} r3> xe$(;Ta"cZÿ,Lۿ\'P5Sȃ@`j?{+9By 뜶?.?Zt,$$,Vh.k1z)I1IKVr)$'cu 8%bHf#W;%w4X})5Qab<&A?:Wqh ^/{HA S`Dyy!3^o@d"r1@l/23deuO4ErOF/N =Kőm%x#˺O7:GjʩOո+q؋7/g3D0䋑 -k~n$%ȡ:~)C taW2 cU1M͗og-JHV-VIo\D{TROKn}iP#:}E:(VZĩ[J6ELj]ŹLk[_ԈNwtn"m08viR !!qݒ)S)SfoB-Jח޹ 2*$['@ZTw oNQ*N]y`oȘ2K>T5y-ʍV mٓ״myn Pgwya5'a  RxVjɗ_?ުkoC6pzT"F@w>בQÒV7sdLiU0) ٻ]":SDA{>76rVph~ /Ra\تJCRfXѓ,?ßOL)L)L)Li3q68pQxh$~uQ?RM!2IrH! >mkJJ٩3$[y?E,~LZ/ߒ^)ߥf$Jobɣ1SHSʭ TdFڬ( &FnNgNLj$ GёȆflMJ yݢa2wISPA1`SjxYh- ;tTtTdrKzrPcX_rw&S6#]IRH  qw# ӫȻ_Or#vE:]TƧ~:5WwEj+R'Z*]u,] Ҧ8fB5ӯՎJYQuInD<0ZveEͅhǢQ@Io<yZYPY lJWuEk.Z=YV+աd4ݷ(gJJJakʔA'H22r_`i4VJfwhl#++wD? >ǟ|v}T_S./W -'騵db"3'HP]zK)6i\+?|]KX3`&/TU9P95Hv'{KygBBrIeH dys2^q\+M@P#:-7}0#J5\q6gmW|3͛?W!>ՄT0RN ݰŐ3%vRa<`gʹib8yrT]M\>F:p uVcxCߊ|Eh@W( ΐv;~;sBU% uI%7Eh~0&vꉓԊQ3}fiq7*P.|ށBLFیP[w)~;涨1izH{sGBi$ .3փ6|xFfn3>Ԗm]΁8&U-;~Z ? 
Mar 21 04:23:14 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 21 04:23:14 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]:
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:14 crc restorecon[4683]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 
crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 
04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.139700 4839 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143235 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143258 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143266 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143274 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143281 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143301 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143310 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143317 4839 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143324 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143333 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143341 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143349 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143356 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143363 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143369 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143375 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143381 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143387 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143396 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143404 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143410 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143417 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143423 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143429 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143435 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143441 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143447 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143454 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143460 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143466 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143474 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143480 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143486 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143492 4839 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143499 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143505 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143511 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143517 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143525 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143534 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143541 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143548 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143554 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143560 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143589 4839 feature_gate.go:330] unrecognized feature gate: Example Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143597 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143603 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143609 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 
04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143615 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143621 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143629 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143635 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143641 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143647 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143653 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143660 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143666 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143673 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143683 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143691 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143699 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143706 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143712 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143719 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143725 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143732 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143750 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143759 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143765 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143771 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143780 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144541 4839 flags.go:64] FLAG: --address="0.0.0.0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144567 4839 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144601 4839 flags.go:64] FLAG: 
--anonymous-auth="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144612 4839 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144621 4839 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144629 4839 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144639 4839 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144647 4839 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144655 4839 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144662 4839 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144670 4839 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144678 4839 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144685 4839 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144692 4839 flags.go:64] FLAG: --cgroup-root="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144699 4839 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144707 4839 flags.go:64] FLAG: --client-ca-file="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144714 4839 flags.go:64] FLAG: --cloud-config="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144721 4839 flags.go:64] FLAG: --cloud-provider="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144728 4839 flags.go:64] FLAG: --cluster-dns="[]" Mar 21 04:23:16 crc 
kubenswrapper[4839]: I0321 04:23:16.144747 4839 flags.go:64] FLAG: --cluster-domain="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144754 4839 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144762 4839 flags.go:64] FLAG: --config-dir="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144769 4839 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144777 4839 flags.go:64] FLAG: --container-log-max-files="5" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144787 4839 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144794 4839 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144802 4839 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144809 4839 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144816 4839 flags.go:64] FLAG: --contention-profiling="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144823 4839 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144830 4839 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144849 4839 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144857 4839 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144883 4839 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144891 4839 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144898 4839 
flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144905 4839 flags.go:64] FLAG: --enable-load-reader="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144912 4839 flags.go:64] FLAG: --enable-server="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144919 4839 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144933 4839 flags.go:64] FLAG: --event-burst="100" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144941 4839 flags.go:64] FLAG: --event-qps="50" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144948 4839 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144955 4839 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144962 4839 flags.go:64] FLAG: --eviction-hard="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144971 4839 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144978 4839 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144985 4839 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144992 4839 flags.go:64] FLAG: --eviction-soft="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144999 4839 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145006 4839 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145013 4839 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145020 4839 flags.go:64] FLAG: --experimental-mounter-path="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.145028 4839 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145034 4839 flags.go:64] FLAG: --fail-swap-on="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145041 4839 flags.go:64] FLAG: --feature-gates="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145050 4839 flags.go:64] FLAG: --file-check-frequency="20s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145057 4839 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145065 4839 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145072 4839 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145079 4839 flags.go:64] FLAG: --healthz-port="10248" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145087 4839 flags.go:64] FLAG: --help="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145094 4839 flags.go:64] FLAG: --hostname-override="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145101 4839 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145109 4839 flags.go:64] FLAG: --http-check-frequency="20s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145116 4839 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145123 4839 flags.go:64] FLAG: --image-credential-provider-config="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145130 4839 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145148 4839 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145157 4839 flags.go:64] FLAG: --image-service-endpoint="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145164 4839 
flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145171 4839 flags.go:64] FLAG: --kube-api-burst="100" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145178 4839 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145186 4839 flags.go:64] FLAG: --kube-api-qps="50" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145193 4839 flags.go:64] FLAG: --kube-reserved="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145200 4839 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145207 4839 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145214 4839 flags.go:64] FLAG: --kubelet-cgroups="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145221 4839 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145228 4839 flags.go:64] FLAG: --lock-file="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145235 4839 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145243 4839 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145250 4839 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145261 4839 flags.go:64] FLAG: --log-json-split-stream="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146045 4839 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146061 4839 flags.go:64] FLAG: --log-text-split-stream="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146069 4839 flags.go:64] FLAG: --logging-format="text" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146076 4839 
flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146084 4839 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146092 4839 flags.go:64] FLAG: --manifest-url="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146099 4839 flags.go:64] FLAG: --manifest-url-header="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146109 4839 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146117 4839 flags.go:64] FLAG: --max-open-files="1000000" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146126 4839 flags.go:64] FLAG: --max-pods="110" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146133 4839 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146141 4839 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146149 4839 flags.go:64] FLAG: --memory-manager-policy="None" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146156 4839 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146164 4839 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146172 4839 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146182 4839 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146201 4839 flags.go:64] FLAG: --node-status-max-images="50" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146209 4839 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146216 
4839 flags.go:64] FLAG: --oom-score-adj="-999" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146307 4839 flags.go:64] FLAG: --pod-cidr="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146319 4839 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146337 4839 flags.go:64] FLAG: --pod-manifest-path="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146344 4839 flags.go:64] FLAG: --pod-max-pids="-1" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146352 4839 flags.go:64] FLAG: --pods-per-core="0" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146359 4839 flags.go:64] FLAG: --port="10250" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146367 4839 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146374 4839 flags.go:64] FLAG: --provider-id="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146381 4839 flags.go:64] FLAG: --qos-reserved="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146388 4839 flags.go:64] FLAG: --read-only-port="10255" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146396 4839 flags.go:64] FLAG: --register-node="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146403 4839 flags.go:64] FLAG: --register-schedulable="true" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146410 4839 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146423 4839 flags.go:64] FLAG: --registry-burst="10" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146431 4839 flags.go:64] FLAG: --registry-qps="5" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146437 4839 flags.go:64] FLAG: --reserved-cpus="" Mar 21 04:23:16 crc kubenswrapper[4839]: 
I0321 04:23:16.146444 4839 flags.go:64] FLAG: --reserved-memory=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146453 4839 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146461 4839 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146468 4839 flags.go:64] FLAG: --rotate-certificates="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146475 4839 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146482 4839 flags.go:64] FLAG: --runonce="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146489 4839 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146497 4839 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146505 4839 flags.go:64] FLAG: --seccomp-default="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146517 4839 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146524 4839 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146532 4839 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146539 4839 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146547 4839 flags.go:64] FLAG: --storage-driver-password="root"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146554 4839 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146561 4839 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146589 4839 flags.go:64] FLAG: --storage-driver-user="root"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146597 4839 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146604 4839 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146612 4839 flags.go:64] FLAG: --system-cgroups=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146631 4839 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146644 4839 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146651 4839 flags.go:64] FLAG: --tls-cert-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146658 4839 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146673 4839 flags.go:64] FLAG: --tls-min-version=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146680 4839 flags.go:64] FLAG: --tls-private-key-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146687 4839 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146694 4839 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146702 4839 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146709 4839 flags.go:64] FLAG: --v="2"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146718 4839 flags.go:64] FLAG: --version="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146727 4839 flags.go:64] FLAG: --vmodule=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146736 4839 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146744 4839 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146965 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146976 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146983 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146990 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146996 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147002 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147009 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147016 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147022 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147028 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147035 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147041 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147047 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147053 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147059 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147065 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147071 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147077 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147083 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147090 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147096 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147102 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147110 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147116 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147122 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147129 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147135 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147142 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147148 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147154 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147161 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147167 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147174 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147180 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147186 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147193 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147202 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147210 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147218 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147228 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147236 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147243 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147250 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147258 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147267 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147274 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147281 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147287 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147293 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147299 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147306 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147313 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147320 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147326 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147332 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147339 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147345 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147352 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147369 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147376 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147383 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147390 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147398 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147406 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147413 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147421 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147428 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147434 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147441 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147447 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147453 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.148855 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.164172 4839 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.164225 4839 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164391 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164418 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164429 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164438 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164446 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164455 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164464 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164472 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164480 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164487 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164495 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164503 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164511 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164520 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164527 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164535 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164543 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164550 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164558 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164596 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164604 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164613 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164621 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164629 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164637 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164645 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164652 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164660 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164668 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164677 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164685 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164693 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164701 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164711 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164720 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164729 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164737 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164745 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164755 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164763 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164771 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164778 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164786 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164794 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164805 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164815 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164824 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164833 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164841 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164849 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164857 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164865 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164873 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164880 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164888 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164896 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164904 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164911 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164920 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164928 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164936 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164944 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164952 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164959 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164967 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164975 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164983 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164991 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164998 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165008 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165019 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.165032 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165268 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165281 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165291 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165301 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165310 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165319 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165327 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165336 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165344 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165352 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165359 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165367 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165375 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165384 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165392 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165399 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165408 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165415 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165423 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165431 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165439 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165447 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165456 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165465 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165473 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165481 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165488 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165496 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165504 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165512 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165520 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165528 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165536 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165544 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165552 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165560 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165594 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165603 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165611 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165620 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165628 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165635 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165643 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165653 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165663 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165671 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165679 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165688 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165697 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165706 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165714 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165721 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165729 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165737 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165744 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165755 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165764 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165772 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165781 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165790 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165799 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165806 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165814 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165822 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165830 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165838 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165845 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165853 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165863 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165873 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165881 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.165893 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.167231 4839 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.175880 4839 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.184024 4839 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.184179 4839 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186358 4839 server.go:997] "Starting client certificate rotation"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186411 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186697 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.213367 4839 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.216943 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.217523 4839 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.234369 4839 log.go:25] "Validated CRI v1 runtime API"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.306657 4839 log.go:25] "Validated CRI v1 image API"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.308627 4839 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.319681 4839 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-21-04-18-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.319727 4839 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.342904 4839 manager.go:217] Machine: {Timestamp:2026-03-21 04:23:16.332186195 +0000 UTC m=+0.659972891 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2a7bfad9-30ba-42d8-b982-971191ebb9d6 BootID:d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634
HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2d:de:1b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2d:de:1b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:50:03:86 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:94:21:78 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:14:0c:d2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3c:5a:55 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:02:ef:bb:a6:9f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:2f:50:22:59:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 
BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.343231 4839 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.343544 4839 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346140 4839 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346325 4839 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346356 4839 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346554 4839 topology_manager.go:138] "Creating topology manager with none policy" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346563 4839 container_manager_linux.go:303] "Creating device plugin manager" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346935 4839 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346959 4839 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.347118 4839 state_mem.go:36] "Initialized new in-memory state store" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.347212 4839 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365109 4839 kubelet.go:418] "Attempting to sync node with API server" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365155 4839 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365181 4839 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365195 4839 kubelet.go:324] "Adding apiserver pod source" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365207 4839 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.371123 4839 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.371661 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.371820 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.371758 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.371909 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.372421 4839 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.377815 4839 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384456 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384514 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384528 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384540 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384559 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384593 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384603 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384619 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384654 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384665 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384696 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384705 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.385967 4839 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.386614 4839 server.go:1280] "Started kubelet" Mar 21 04:23:16 crc systemd[1]: Started Kubernetes Kubelet. Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.389542 4839 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.389547 4839 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393149 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393258 4839 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393318 4839 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393223 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394137 4839 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394171 4839 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.394178 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394274 4839 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.395412 4839 server.go:460] "Adding debug handlers to kubelet server" Mar 21 04:23:16 crc 
kubenswrapper[4839]: W0321 04:23:16.395707 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.395807 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.395823 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403256 4839 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403297 4839 factory.go:55] Registering systemd factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403307 4839 factory.go:221] Registration of the systemd container factory successfully Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403701 4839 factory.go:153] Registering CRI-O factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403745 4839 factory.go:221] Registration of the crio container factory successfully Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403790 4839 factory.go:103] Registering Raw factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.403813 4839 manager.go:1196] Started watching for new ooms in manager Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.404813 4839 manager.go:319] Starting recovery of all containers Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.404640 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408324 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408401 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408417 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408431 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408444 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408456 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408468 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408481 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408497 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408509 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408521 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408535 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408547 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408598 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408610 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408622 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408668 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408682 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408695 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408707 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408719 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408732 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408745 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408758 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408770 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408812 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408830 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408843 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408855 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408867 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408881 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408895 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408908 
4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408920 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408961 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408975 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408988 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409026 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409039 4839 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409052 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409064 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409076 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409091 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409103 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409114 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409126 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409139 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409156 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409170 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409184 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409196 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409214 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409228 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409241 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409254 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409267 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409279 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409291 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409302 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409313 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409324 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409338 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409349 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409361 4839 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409374 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409392 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409405 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409417 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409429 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409442 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409453 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409466 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409477 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409491 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409507 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409519 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409531 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409543 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409556 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409588 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409600 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409615 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409650 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409664 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409678 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409691 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409704 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409716 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.409729 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409741 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409754 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409766 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409779 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409791 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409803 4839 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409816 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409829 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409841 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409855 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409873 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409888 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409902 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409916 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409935 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409950 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409966 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409980 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409994 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410008 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410022 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410035 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410051 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410063 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410079 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410093 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410106 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410118 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410131 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410143 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410156 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410169 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410182 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410193 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410206 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410218 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410230 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410267 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410279 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410291 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410305 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410317 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410345 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410359 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410379 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410392 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410405 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410416 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410429 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410441 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410454 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410466 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410478 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410490 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410502 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410514 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410526 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410539 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410592 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410605 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410618 4839 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410647 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410723 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410747 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410782 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410813 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410826 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410890 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410906 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410918 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410949 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410966 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410995 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411030 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411043 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411081 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411095 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411110 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415504 4839 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415559 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415603 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415619 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415650 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415666 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415680 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415695 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415708 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415721 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415734 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415747 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415761 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415776 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415790 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415805 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415835 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415852 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415869 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415886 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415899 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415913 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415926 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: 
I0321 04:23:16.415939 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415952 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415965 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415980 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415992 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416007 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416020 4839 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416033 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416046 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416059 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416133 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416147 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416160 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416173 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416186 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416200 4839 reconstruct.go:97] "Volume reconstruction finished" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416210 4839 reconciler.go:26] "Reconciler: start to sync state" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.428675 4839 manager.go:324] Recovery completed Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.436907 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438892 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438928 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439676 4839 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439705 4839 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439733 4839 state_mem.go:36] "Initialized new in-memory state store" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.448431 4839 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451432 4839 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451502 4839 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451539 4839 kubelet.go:2335] "Starting kubelet main sync loop" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.451626 4839 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.452462 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.452554 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.454200 4839 policy_none.go:49] "None policy: Start" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.455116 4839 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.455162 4839 state_mem.go:35] 
"Initializing new in-memory state store" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.494499 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510054 4839 manager.go:334] "Starting Device Plugin manager" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510210 4839 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510248 4839 server.go:79] "Starting device plugin registration server" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510747 4839 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510769 4839 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511395 4839 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511513 4839 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511527 4839 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.519415 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.551760 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:23:16 crc kubenswrapper[4839]: 
I0321 04:23:16.551874 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553689 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553876 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553931 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554664 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554713 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554731 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554739 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554761 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554926 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554968 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555802 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555961 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556527 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556597 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556957 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557443 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557448 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557578 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557598 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557613 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558493 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558771 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558817 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558923 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558932 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560766 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560843 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.596446 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.611908 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617196 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617245 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617611 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617710 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617731 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.617861 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617896 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618050 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618129 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618183 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618289 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618329 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618370 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719583 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719602 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719621 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719682 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719700 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719736 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719772 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719769 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719805 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719808 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719851 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719868 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719903 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719993 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719999 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720032 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720029 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.818987 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820250 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820275 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.820695 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.894726 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.917853 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.932610 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.950615 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.957034 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.965732 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe WatchSource:0}: Error finding container 03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe: Status 404 returned error can't find the container with id 03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.969047 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243 WatchSource:0}: Error finding container 0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243: Status 404 returned error can't find the container with id 0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.975074 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc WatchSource:0}: Error finding container 014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc: Status 404 returned error can't find the container with id 014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.989588 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549 WatchSource:0}: Error finding container 754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549: Status 404 returned error can't find the container with id 754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.997905 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.221079 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223488 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223535 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223589 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.224053 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc"
Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.376908 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.377007 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.394893 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.456340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc"}
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.462471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243"}
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.463475 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe"}
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.464373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a8e721ca03ad45ff241630168dd0d108390701f4a4d5343f2606e1ee00a9be73"}
Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.465298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549"}
Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.675951 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.676025 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.709336 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.709428 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.799464 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s"
Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.841013 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.841104 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.024674 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026231 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026275 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:23:18 crc kubenswrapper[4839]: E0321 04:23:18.026905 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.321344 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:23:18 crc kubenswrapper[4839]: E0321 04:23:18.322975 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.395132 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.469991 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e"}
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.470067 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"}
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.470086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c"}
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472015 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" exitCode=0
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649"}
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472224 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473591 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473820 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475492 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf" exitCode=0
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf"}
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475656 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.476949 4839 kubelet_node_status.go:724] "Recording
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.476997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.477014 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.477648 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478701 4839 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478831 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478899 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478914 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478909 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481011 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482764 4839 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482876 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483833 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.395417 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.401430 
4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.489726 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490043 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490137 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490170 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491152 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491228 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.493982 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494042 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494904 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499130 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499190 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499213 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501188 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5" exitCode=0 Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501261 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501399 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.507252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.507386 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508147 
4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.627350 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628705 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.629120 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.820381 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.836209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:19 crc kubenswrapper[4839]: W0321 
04:23:19.840454 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.840538 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513238 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c" exitCode=0 Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513347 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513330 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c"} Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514114 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514135 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514146 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517466 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"} Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517561 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517616 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517661 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517721 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518122 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518416 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519422 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 
04:23:20.519453 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519588 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519599 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.361378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524724 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524782 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524817 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 
04:23:21.524864 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524769 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524923 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524949 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524761 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526426 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526513 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526632 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526776 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527582 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527621 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.881950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.518791 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536025 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd"} Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536092 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536207 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536297 4839 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537882 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537939 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537963 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538128 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538161 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.829746 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 
04:23:22.832273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832287 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832324 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.841853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.539642 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541832 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.923199 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.923470 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925135 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 
04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.250032 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.543399 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.543465 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545419 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.450423 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.450721 4839 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452462 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452472 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:26 crc kubenswrapper[4839]: E0321 04:23:26.519600 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.251267 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.251367 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.472487 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.472762 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.474091 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.474132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.474147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.254945 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.255080 4839 trace.go:236] Trace[1885837127]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.253) (total time: 10001ms):
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[1885837127]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:23:30.254)
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[1885837127]: [10.001844256s] [10.001844256s] END
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.255116 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.270715 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.270822 4839 trace.go:236] Trace[2096808232]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.269) (total time: 10001ms):
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[2096808232]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:23:30.270)
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[2096808232]: [10.001268971s] [10.001268971s] END
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.270845 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.394995 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.693793 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.693948 4839 trace.go:236] Trace[925494500]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.692) (total time: 10001ms):
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[925494500]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:23:30.693)
Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[925494500]: [10.001568458s] [10.001568458s] END
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.693990 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.887954 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889021 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.889251 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889327 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889649 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.890558 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.899660 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.899722 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.904502 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.904591 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.368405 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]log ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]etcd ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-filter ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-informers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-controllers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/crd-informer-synced ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-system-namespaces-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 21 04:23:31 crc kubenswrapper[4839]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/bootstrap-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-registration-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-discovery-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]autoregister-completion ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapi-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 21 04:23:31 crc kubenswrapper[4839]: livez check failed
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.368489 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.396462 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:31Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.566726 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.568959 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889" exitCode=255
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.569010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"}
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.569189 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570323 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570381 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.571274 4839 scope.go:117] "RemoveContainer" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.888986 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.889168 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890455 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890558 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890641 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.398218 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:32Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.575482 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.577696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"}
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.577910 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579285 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.873839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.874014 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875186 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.891761 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.399475 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.583760 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.584520 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587233 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" exitCode=255
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"}
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587393 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587446 4839 scope.go:117] "RemoveContainer" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587597 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590179 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591320 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591335 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.592324 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"
Mar 21 04:23:33 crc kubenswrapper[4839]: E0321 04:23:33.592490 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.679367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:33 crc kubenswrapper[4839]: W0321 04:23:33.961108 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:33 crc kubenswrapper[4839]: E0321 04:23:33.961245 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:34 crc kubenswrapper[4839]: W0321 04:23:34.372335 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:34 crc kubenswrapper[4839]: E0321 04:23:34.372413 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.397400 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.593176 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.595894 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.596844 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.596894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.596905 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.597332 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"
Mar 21 04:23:34 crc kubenswrapper[4839]: E0321 04:23:34.597496 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:23:35 crc kubenswrapper[4839]: I0321 04:23:35.399479 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:35Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.373165 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.373426 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375300 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375396 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.379792 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"
Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.380439 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.384927 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.399093 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.519766 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.601239 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.602834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.603406 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.603746 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.605167 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"
Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.605839 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:23:36 crc kubenswrapper[4839]: W0321 04:23:36.659767 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.659868 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.251780 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.251889 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.288120 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289751 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289833 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289854 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289906 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.293157 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.295647 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.397844 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.770631 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.770894 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.772965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773010 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773799 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"
Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.774027 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:23:38 crc kubenswrapper[4839]: I0321 04:23:38.399888 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:38Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:39 crc kubenswrapper[4839]: I0321 04:23:39.073852 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:23:39 crc kubenswrapper[4839]: E0321 04:23:39.077489 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:39 crc kubenswrapper[4839]: I0321 04:23:39.398673 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:39Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:40 crc kubenswrapper[4839]: I0321 04:23:40.397560 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:40Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:40 crc kubenswrapper[4839]: E0321 04:23:40.895189 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:23:41 crc kubenswrapper[4839]: W0321 04:23:41.373355 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z
Mar 21 04:23:41 crc kubenswrapper[4839]: E0321 04:23:41.373440 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:23:41 crc kubenswrapper[4839]: I0321 04:23:41.398009 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z Mar 21 04:23:42 crc kubenswrapper[4839]: I0321 04:23:42.398778 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:42Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: W0321 04:23:43.046506 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: E0321 04:23:43.046659 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:43 crc kubenswrapper[4839]: I0321 04:23:43.399191 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: W0321 04:23:43.916269 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: E0321 04:23:43.916398 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.294079 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295687 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295741 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295768 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:44 crc kubenswrapper[4839]: E0321 04:23:44.301135 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:44 crc kubenswrapper[4839]: E0321 04:23:44.301556 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.399858 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z Mar 21 04:23:45 crc kubenswrapper[4839]: I0321 04:23:45.398703 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:45Z is after 2026-02-23T05:33:13Z Mar 21 04:23:46 crc kubenswrapper[4839]: I0321 04:23:46.399295 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:46Z is after 2026-02-23T05:33:13Z Mar 21 04:23:46 crc kubenswrapper[4839]: E0321 04:23:46.519846 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.251449 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.251852 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.252000 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.252226 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253652 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253713 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.254483 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.254762 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" gracePeriod=30 Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.397695 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:47Z is after 2026-02-23T05:33:13Z Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635087 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635693 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" exitCode=255 Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635769 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"} Mar 21 04:23:48 crc kubenswrapper[4839]: W0321 04:23:48.313208 4839 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z Mar 21 04:23:48 crc kubenswrapper[4839]: E0321 04:23:48.313319 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.397037 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.641734 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.642599 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.642680 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:48 crc 
kubenswrapper[4839]: I0321 04:23:48.643772 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.643806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.643816 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.397606 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:49Z is after 2026-02-23T05:33:13Z Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.647891 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.648946 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.649000 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.649016 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.397271 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:50Z is after 2026-02-23T05:33:13Z Mar 21 04:23:50 crc 
kubenswrapper[4839]: I0321 04:23:50.452126 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454293 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454331 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454983 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.660671 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:50 crc kubenswrapper[4839]: E0321 04:23:50.899415 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.301720 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302851 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302902 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.304398 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.305779 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.396586 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.666513 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.666935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668480 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" exitCode=255 Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce"} Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668602 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668724 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669669 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669690 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.670524 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.671090 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:52 crc kubenswrapper[4839]: I0321 04:23:52.397422 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:52Z is after 2026-02-23T05:33:13Z Mar 21 04:23:52 crc kubenswrapper[4839]: I0321 04:23:52.672858 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.398666 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:53Z is after 2026-02-23T05:33:13Z Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.679470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.679759 4839 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680934 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680963 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.681439 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:53 crc kubenswrapper[4839]: E0321 04:23:53.681636 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.923270 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.923458 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924765 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924781 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.250401 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.398054 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:54Z is after 2026-02-23T05:33:13Z Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.679462 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680295 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680336 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680354 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:55 crc kubenswrapper[4839]: I0321 04:23:55.178547 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:55 crc kubenswrapper[4839]: E0321 04:23:55.182192 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-21T04:23:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:55 crc kubenswrapper[4839]: E0321 04:23:55.183417 4839 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 21 04:23:55 crc kubenswrapper[4839]: I0321 04:23:55.396966 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:55Z is after 2026-02-23T05:33:13Z Mar 21 04:23:56 crc kubenswrapper[4839]: I0321 04:23:56.397033 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:56Z is after 2026-02-23T05:33:13Z Mar 21 04:23:56 crc kubenswrapper[4839]: E0321 04:23:56.519964 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.251434 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.251557 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.397615 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:57Z is after 2026-02-23T05:33:13Z Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.770648 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.770976 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772470 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.773177 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:57 crc kubenswrapper[4839]: E0321 04:23:57.773357 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.306374 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307770 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307795 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.308044 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.310492 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.400252 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z Mar 21 04:23:58 crc kubenswrapper[4839]: W0321 04:23:58.917117 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.917224 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:59 crc kubenswrapper[4839]: I0321 04:23:59.397425 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:59Z is after 2026-02-23T05:33:13Z Mar 21 04:24:00 crc kubenswrapper[4839]: I0321 04:24:00.397844 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:00Z is after 2026-02-23T05:33:13Z Mar 21 04:24:00 crc kubenswrapper[4839]: E0321 
04:24:00.902729 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:01 crc kubenswrapper[4839]: W0321 04:24:01.139787 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z Mar 21 04:24:01 crc kubenswrapper[4839]: E0321 04:24:01.139868 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:24:01 crc kubenswrapper[4839]: I0321 04:24:01.398219 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z Mar 21 04:24:02 crc kubenswrapper[4839]: I0321 04:24:02.397400 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:02Z is after 2026-02-23T05:33:13Z Mar 21 04:24:03 crc kubenswrapper[4839]: I0321 04:24:03.398560 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:03Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: I0321 04:24:04.398711 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: W0321 04:24:04.736322 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: E0321 04:24:04.736406 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.311720 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:05 crc kubenswrapper[4839]: E0321 04:24:05.311873 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313089 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313152 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313180 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:05 crc kubenswrapper[4839]: E0321 04:24:05.315708 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 
04:24:05.400080 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z Mar 21 04:24:06 crc kubenswrapper[4839]: I0321 04:24:06.399011 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:06Z is after 2026-02-23T05:33:13Z Mar 21 04:24:06 crc kubenswrapper[4839]: E0321 04:24:06.520114 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.250927 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.251095 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.398987 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:07Z is after 2026-02-23T05:33:13Z Mar 21 04:24:08 crc kubenswrapper[4839]: I0321 04:24:08.401832 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: W0321 04:24:09.339294 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: E0321 04:24:09.339365 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398305 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398364 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398872 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400764 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:10 crc kubenswrapper[4839]: I0321 04:24:10.401674 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.908178 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.914858 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.919927 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.925923 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.930305 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088af12e94c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.515891532 +0000 UTC m=+0.843678208,LastTimestamp:2026-03-21 04:23:16.515891532 +0000 UTC m=+0.843678208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.935276 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.553508373 +0000 
UTC m=+0.881295049,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.940732 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.553535404 +0000 UTC m=+0.881322080,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.944993 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.553545814 +0000 UTC m=+0.881332490,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 
04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.949391 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.554641629 +0000 UTC m=+0.882428305,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.953477 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.55465985 +0000 UTC m=+0.882446526,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.957939 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.55466963 +0000 UTC m=+0.882456306,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.961452 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.554725991 +0000 UTC m=+0.882512667,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.964790 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.554736102 +0000 UTC m=+0.882522778,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.968769 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.554744022 +0000 UTC m=+0.882530698,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.972210 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.555774686 +0000 UTC m=+0.883561382,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.979734 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.555793796 +0000 UTC m=+0.883580492,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.985185 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.555810127 +0000 UTC m=+0.883596823,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.989407 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.556993344 +0000 UTC m=+0.884780020,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.993316 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC 
m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557019665 +0000 UTC m=+0.884806341,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.997005 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.557028815 +0000 UTC m=+0.884815491,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.000858 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.55725637 +0000 UTC m=+0.885043046,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.005429 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557281751 +0000 UTC m=+0.885068427,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.009348 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.557292121 +0000 UTC m=+0.885078797,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.016353 4839 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.557578047 +0000 UTC m=+0.885364723,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.019923 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557608958 +0000 UTC m=+0.885395634,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.024624 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088c92a6187 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.953637255 +0000 UTC m=+1.281423961,LastTimestamp:2026-03-21 04:23:16.953637255 +0000 UTC m=+1.281423961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.028689 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088ca45edee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.972219886 +0000 UTC m=+1.300006562,LastTimestamp:2026-03-21 04:23:16.972219886 +0000 UTC 
m=+1.300006562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.033107 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088ca490f8d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.972425101 +0000 UTC m=+1.300211807,LastTimestamp:2026-03-21 04:23:16.972425101 +0000 UTC m=+1.300211807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.038256 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088cb0920ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.985012462 +0000 UTC m=+1.312799138,LastTimestamp:2026-03-21 04:23:16.985012462 +0000 UTC m=+1.312799138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.040326 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088cb831332 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.993004338 +0000 UTC m=+1.320791014,LastTimestamp:2026-03-21 04:23:16.993004338 +0000 UTC m=+1.320791014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.041841 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088fbfb6018 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806194712 +0000 UTC m=+2.133981388,LastTimestamp:2026-03-21 04:23:17.806194712 +0000 UTC m=+2.133981388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.045962 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088fbfb5fdc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806194652 +0000 UTC m=+2.133981348,LastTimestamp:2026-03-21 04:23:17.806194652 +0000 UTC m=+2.133981348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.049501 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088fbfcea52 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806295634 +0000 UTC m=+2.134082310,LastTimestamp:2026-03-21 04:23:17.806295634 +0000 UTC m=+2.134082310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.053134 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fbfd015e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806301534 +0000 UTC m=+2.134088250,LastTimestamp:2026-03-21 04:23:17.806301534 +0000 UTC m=+2.134088250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.056563 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088fc04146d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806765165 +0000 UTC m=+2.134551851,LastTimestamp:2026-03-21 04:23:17.806765165 +0000 UTC m=+2.134551851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.059746 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcbaae6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.818732142 +0000 UTC m=+2.146518818,LastTimestamp:2026-03-21 04:23:17.818732142 +0000 UTC m=+2.146518818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.063481 4839 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcd112c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,LastTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.066656 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088fce323d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.821383634 +0000 UTC m=+2.149170310,LastTimestamp:2026-03-21 04:23:17.821383634 +0000 UTC m=+2.149170310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.070149 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088fceaa789 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.821876105 +0000 UTC m=+2.149662781,LastTimestamp:2026-03-21 04:23:17.821876105 +0000 UTC m=+2.149662781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.074037 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088fced4d8f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.822049679 +0000 UTC m=+2.149836365,LastTimestamp:2026-03-21 04:23:17.822049679 +0000 UTC 
m=+2.149836365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.077344 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088fcf0454e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.822244174 +0000 UTC m=+2.150030850,LastTimestamp:2026-03-21 04:23:17.822244174 +0000 UTC m=+2.150030850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.081687 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890e4a5fa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,LastTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.085312 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f18c62d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,LastTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.088813 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f2fb464 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.128391268 +0000 UTC m=+2.456177964,LastTimestamp:2026-03-21 04:23:18.128391268 +0000 UTC m=+2.456177964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.092029 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891a6a9381 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.316798849 +0000 UTC m=+2.644585525,LastTimestamp:2026-03-21 04:23:18.316798849 +0000 UTC m=+2.644585525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.095281 4839 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891b1325b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.327846324 +0000 UTC m=+2.655633000,LastTimestamp:2026-03-21 04:23:18.327846324 +0000 UTC m=+2.655633000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.098545 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891b243a70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:18.328965744 +0000 UTC m=+2.656752410,LastTimestamp:2026-03-21 04:23:18.328965744 +0000 UTC m=+2.656752410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.102112 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08923fcecf5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.477384949 +0000 UTC m=+2.805171635,LastTimestamp:2026-03-21 04:23:18.477384949 +0000 UTC m=+2.805171635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.105468 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089241ff99f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.479681951 +0000 UTC m=+2.807468647,LastTimestamp:2026-03-21 04:23:18.479681951 +0000 UTC m=+2.807468647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.108958 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec089244654e8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.482195688 +0000 UTC m=+2.809982374,LastTimestamp:2026-03-21 04:23:18.482195688 +0000 UTC m=+2.809982374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.112252 4839 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08924997f3e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.487646014 +0000 UTC m=+2.815432700,LastTimestamp:2026-03-21 04:23:18.487646014 +0000 UTC m=+2.815432700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.115598 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0892865f19e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.551376286 +0000 UTC 
m=+2.879162962,LastTimestamp:2026-03-21 04:23:18.551376286 +0000 UTC m=+2.879162962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.118738 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0892a1a61ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.57997873 +0000 UTC m=+2.907765426,LastTimestamp:2026-03-21 04:23:18.57997873 +0000 UTC m=+2.907765426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.122033 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08930386c23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.682610723 +0000 UTC m=+3.010397409,LastTimestamp:2026-03-21 04:23:18.682610723 +0000 UTC m=+3.010397409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.125237 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec089304673d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683530197 +0000 UTC m=+3.011316873,LastTimestamp:2026-03-21 04:23:18.683530197 +0000 UTC m=+3.011316873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.129284 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0893049ffc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683762633 +0000 UTC m=+3.011549309,LastTimestamp:2026-03-21 04:23:18.683762633 +0000 UTC m=+3.011549309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.133067 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec089304beb0d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683888397 +0000 UTC m=+3.011675073,LastTimestamp:2026-03-21 04:23:18.683888397 +0000 UTC m=+3.011675073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.136824 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec0893185ac4c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.704450636 +0000 UTC m=+3.032237332,LastTimestamp:2026-03-21 04:23:18.704450636 +0000 UTC m=+3.032237332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.140382 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08931d8d544 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709900612 +0000 UTC m=+3.037687288,LastTimestamp:2026-03-21 04:23:18.709900612 +0000 UTC m=+3.037687288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.142098 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08931d91766 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709917542 +0000 UTC m=+3.037704238,LastTimestamp:2026-03-21 04:23:18.709917542 +0000 UTC m=+3.037704238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.144983 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec08931d8d490 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709900432 +0000 UTC m=+3.037687128,LastTimestamp:2026-03-21 04:23:18.709900432 +0000 UTC m=+3.037687128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.148164 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08931e84387 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.710911879 +0000 UTC m=+3.038698565,LastTimestamp:2026-03-21 04:23:18.710911879 +0000 UTC m=+3.038698565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.151369 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08931f4b4b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.711727281 +0000 UTC m=+3.039513957,LastTimestamp:2026-03-21 04:23:18.711727281 
+0000 UTC m=+3.039513957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.154452 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0893f1b027e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.932341374 +0000 UTC m=+3.260128050,LastTimestamp:2026-03-21 04:23:18.932341374 +0000 UTC m=+3.260128050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.158036 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0893f1cb199 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.932451737 +0000 UTC m=+3.260238413,LastTimestamp:2026-03-21 04:23:18.932451737 +0000 UTC m=+3.260238413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.161097 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894037fb89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.951017353 +0000 UTC m=+3.278804029,LastTimestamp:2026-03-21 04:23:18.951017353 +0000 UTC m=+3.278804029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.164472 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089404d8bf5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.952430581 +0000 UTC m=+3.280217257,LastTimestamp:2026-03-21 04:23:18.952430581 +0000 UTC m=+3.280217257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.167908 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894083a707 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.955976455 +0000 UTC m=+3.283763131,LastTimestamp:2026-03-21 04:23:18.955976455 +0000 UTC m=+3.283763131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.171203 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08940a38979 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.958066041 +0000 UTC m=+3.285852707,LastTimestamp:2026-03-21 04:23:18.958066041 +0000 UTC m=+3.285852707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.174408 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894a3bec8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.119047822 +0000 UTC m=+3.446834528,LastTimestamp:2026-03-21 04:23:19.119047822 +0000 UTC m=+3.446834528,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.178181 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894a4e2004 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.120240644 +0000 UTC m=+3.448027310,LastTimestamp:2026-03-21 04:23:19.120240644 +0000 UTC m=+3.448027310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.181806 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894afca1c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:19.13167712 +0000 UTC m=+3.459463796,LastTimestamp:2026-03-21 04:23:19.13167712 +0000 UTC m=+3.459463796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.185333 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894b1590a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.133311143 +0000 UTC m=+3.461097819,LastTimestamp:2026-03-21 04:23:19.133311143 +0000 UTC m=+3.461097819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.188768 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894b2fef9d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.135039389 +0000 UTC m=+3.462826065,LastTimestamp:2026-03-21 04:23:19.135039389 +0000 UTC m=+3.462826065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.191923 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08955d80194 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.313826196 +0000 UTC m=+3.641612872,LastTimestamp:2026-03-21 04:23:19.313826196 +0000 UTC m=+3.641612872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.195489 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956c6c930 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.329474864 +0000 UTC m=+3.657261540,LastTimestamp:2026-03-21 04:23:19.329474864 +0000 UTC m=+3.657261540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.199490 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956da05e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,LastTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 
04:24:11.204267 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089615dafaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.507136431 +0000 UTC m=+3.834923107,LastTimestamp:2026-03-21 04:23:19.507136431 +0000 UTC m=+3.834923107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.208107 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089642866eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.553976043 +0000 UTC m=+3.881762719,LastTimestamp:2026-03-21 04:23:19.553976043 +0000 UTC 
m=+3.881762719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.211947 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089660847fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.585425403 +0000 UTC m=+3.913212079,LastTimestamp:2026-03-21 04:23:19.585425403 +0000 UTC m=+3.913212079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.216074 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0896f33bf8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.739269003 +0000 UTC m=+4.067055679,LastTimestamp:2026-03-21 
04:23:19.739269003 +0000 UTC m=+4.067055679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.219693 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec08970343400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.756076032 +0000 UTC m=+4.083862708,LastTimestamp:2026-03-21 04:23:19.756076032 +0000 UTC m=+4.083862708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.223355 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0899d7a9cab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:20.515665067 +0000 UTC m=+4.843451733,LastTimestamp:2026-03-21 04:23:20.515665067 +0000 UTC m=+4.843451733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.227939 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ac3cfc52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.763284562 +0000 UTC m=+5.091071238,LastTimestamp:2026-03-21 04:23:20.763284562 +0000 UTC m=+5.091071238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.231359 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ad08a3bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.776631229 +0000 UTC m=+5.104417915,LastTimestamp:2026-03-21 04:23:20.776631229 +0000 UTC 
m=+5.104417915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.234323 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ad18a4e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.777680097 +0000 UTC m=+5.105466793,LastTimestamp:2026-03-21 04:23:20.777680097 +0000 UTC m=+5.105466793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.237936 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bb509eeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.016229611 +0000 UTC 
m=+5.344016287,LastTimestamp:2026-03-21 04:23:21.016229611 +0000 UTC m=+5.344016287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.241145 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bc25ec66 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.030208614 +0000 UTC m=+5.357995290,LastTimestamp:2026-03-21 04:23:21.030208614 +0000 UTC m=+5.357995290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.244587 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bc33e5e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.031124449 +0000 UTC m=+5.358911125,LastTimestamp:2026-03-21 04:23:21.031124449 +0000 UTC m=+5.358911125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.248439 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c84d9d2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.234136363 +0000 UTC m=+5.561923059,LastTimestamp:2026-03-21 04:23:21.234136363 +0000 UTC m=+5.561923059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.251487 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c9293439 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.248527417 +0000 UTC m=+5.576314103,LastTimestamp:2026-03-21 04:23:21.248527417 +0000 UTC m=+5.576314103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.255334 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c93b8b3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.249729339 +0000 UTC m=+5.577516025,LastTimestamp:2026-03-21 04:23:21.249729339 +0000 UTC m=+5.577516025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.259201 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d6fa7bad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.480346541 +0000 UTC m=+5.808133227,LastTimestamp:2026-03-21 04:23:21.480346541 +0000 UTC m=+5.808133227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.262694 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d79c7f00 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.490964224 +0000 UTC m=+5.818750910,LastTimestamp:2026-03-21 04:23:21.490964224 +0000 UTC m=+5.818750910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.265821 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d7ace029 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.492037673 +0000 UTC m=+5.819824359,LastTimestamp:2026-03-21 04:23:21.492037673 +0000 UTC m=+5.819824359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.269879 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089e261670a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.671640842 +0000 UTC m=+5.999427508,LastTimestamp:2026-03-21 04:23:21.671640842 +0000 UTC m=+5.999427508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.272856 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089e303b6d0 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.682278096 +0000 UTC m=+6.010064762,LastTimestamp:2026-03-21 04:23:21.682278096 +0000 UTC m=+6.010064762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.277728 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef4d318 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,LastTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc 
kubenswrapper[4839]: E0321 04:24:11.281092 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef5d238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,LastTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.284543 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c086a7edc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:24:11 crc kubenswrapper[4839]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,LastTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.287943 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c086c41e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,LastTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.291256 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08c086a7edc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c086a7edc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:24:11 crc kubenswrapper[4839]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,LastTimestamp:2026-03-21 04:23:30.904550433 +0000 UTC m=+15.232337119,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.294986 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08c086c41e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c086c41e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,LastTimestamp:2026-03-21 04:23:30.904616535 +0000 UTC m=+15.232403221,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.298038 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c245b404e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 21 04:24:11 crc kubenswrapper[4839]: body: [+]ping ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]log ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]etcd ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-controllers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/crd-informer-synced ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 04:24:11 crc kubenswrapper[4839]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/bootstrap-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: 
[+]poststarthook/apiservice-registration-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-discovery-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]autoregister-completion ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: livez check failed Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:31.368468558 +0000 UTC m=+15.696255264,LastTimestamp:2026-03-21 04:23:31.368468558 +0000 UTC m=+15.696255264,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.300887 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c245c0986 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:31.36852007 +0000 UTC m=+15.696306786,LastTimestamp:2026-03-21 04:23:31.36852007 +0000 UTC 
m=+15.696306786,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.304765 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08956da05e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956da05e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,LastTimestamp:2026-03-21 04:23:31.572396107 +0000 UTC m=+15.900182783,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.309178 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.312621 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8309b48a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC m=+21.579713874,LastTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC 
m=+21.579713874,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.316780 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8308d163\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:23:47.251820604 +0000 UTC m=+31.579607300,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.319994 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8309b48a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8309b48a 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC m=+21.579713874,LastTimestamp:2026-03-21 04:23:47.251953719 +0000 UTC m=+31.579740415,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.323229 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08fd740778d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:47.254736781 +0000 UTC m=+31.582523487,LastTimestamp:2026-03-21 04:23:47.254736781 +0000 UTC m=+31.582523487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc 
kubenswrapper[4839]: E0321 04:24:11.327050 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec088fcd112c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcd112c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,LastTimestamp:2026-03-21 04:23:47.944168573 +0000 UTC m=+32.271955249,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.333657 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec0890e4a5fa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890e4a5fa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,LastTimestamp:2026-03-21 04:23:48.272318002 +0000 UTC m=+32.600104678,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.336919 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec0890f18c62d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f18c62d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,LastTimestamp:2026-03-21 04:23:48.363119666 +0000 UTC m=+32.690906342,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.344702 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08b2ef4d318\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef4d318 openshift-kube-controller-manager 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,LastTimestamp:2026-03-21 04:23:57.251519488 +0000 UTC m=+41.579306204,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.348372 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08b2ef5d238\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef5d238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,LastTimestamp:2026-03-21 04:23:57.251648062 +0000 UTC 
m=+41.579434768,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.352166 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8308d163\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:24:07.251052796 +0000 UTC m=+51.578839512,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: I0321 04:24:11.398292 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.316630 4839 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318887 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318933 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:12 crc kubenswrapper[4839]: E0321 04:24:12.320505 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:12 crc kubenswrapper[4839]: E0321 04:24:12.320989 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.400020 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.401900 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.452802 4839 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454422 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.455488 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.728941 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.731897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66"} Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.732271 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733870 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733907 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733919 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 21 04:24:14 crc kubenswrapper[4839]: I0321 04:24:14.399835 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.399341 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.739012 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.739766 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741425 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" exitCode=255 Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66"} Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741497 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741679 4839 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742488 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.749188 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:15 crc kubenswrapper[4839]: E0321 04:24:15.750465 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:16 crc kubenswrapper[4839]: I0321 04:24:16.398749 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:16 crc kubenswrapper[4839]: E0321 04:24:16.520472 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:16 crc kubenswrapper[4839]: I0321 04:24:16.746384 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 
04:24:17.251290 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251351 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251401 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251526 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253434 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.254054 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.254139 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505" gracePeriod=30 Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.399904 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.753444 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754545 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754923 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505" exitCode=255 Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754967 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 
04:24:17.755000 4839 scope.go:117] "RemoveContainer" containerID="1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.770406 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.770580 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771525 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771602 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771619 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.772226 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:17 crc kubenswrapper[4839]: E0321 04:24:17.772425 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.400304 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 
04:24:18.759626 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.761086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065"} Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.761269 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762439 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762454 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.320650 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321786 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321818 4839 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 21 04:24:19 crc kubenswrapper[4839]: E0321 04:24:19.324560 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:19 crc kubenswrapper[4839]: E0321 04:24:19.325424 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.395633 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.765286 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766188 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:20 crc kubenswrapper[4839]: I0321 04:24:20.411590 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:21 crc kubenswrapper[4839]: I0321 04:24:21.402037 4839 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:22 crc kubenswrapper[4839]: I0321 04:24:22.397910 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.398651 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.678703 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.678985 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680773 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680782 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.681308 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:23 crc kubenswrapper[4839]: E0321 04:24:23.681455 4839 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.923908 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.924071 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925228 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.250482 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.254131 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.397987 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 
04:24:24.774620 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775440 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.398864 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.776537 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777421 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777434 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.325091 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:26 crc kubenswrapper[4839]: 
I0321 04:24:26.326254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326266 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326291 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.329907 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.330094 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.397930 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.520953 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.185120 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.198044 4839 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.399814 4839 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:28 crc kubenswrapper[4839]: I0321 04:24:28.399810 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:29 crc kubenswrapper[4839]: I0321 04:24:29.398854 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:30 crc kubenswrapper[4839]: I0321 04:24:30.398312 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.399493 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.761548 4839 csr.go:261] certificate signing request csr-t6bsz is approved, waiting to be issued Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.780867 4839 csr.go:257] certificate signing request csr-t6bsz is issued Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.836816 4839 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.187011 4839 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.782104 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-16 20:32:32.914753866 +0000 UTC Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.782153 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6496h8m0.132603148s for next certificate rotation Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.330440 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331492 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331528 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331537 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331647 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.340028 4839 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.340109 4839 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.340127 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342963 
4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342989 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.343006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.343018 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.357180 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363555 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363612 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363623 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363637 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363646 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.373896 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380140 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.388923 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397513 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397782 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397815 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397832 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409392 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409498 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409520 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.510677 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.611347 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.712345 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.813162 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.913410 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.927492 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.927711 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.929021 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc 
kubenswrapper[4839]: I0321 04:24:33.929054 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.929063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.013514 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.114030 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.215179 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.315745 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.416914 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.451931 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453675 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453687 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.518071 4839 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.618981 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.720011 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.821186 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.922615 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.023738 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.124645 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.225767 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.326428 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.426818 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.452066 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453195 4839 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453794 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.453966 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.527356 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.535276 4839 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.628379 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.729443 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.830392 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.931048 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.031332 4839 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.132162 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.232875 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.333467 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.434354 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.522049 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.534442 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.634911 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.735987 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.836551 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.937257 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.037399 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.137944 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.238916 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.339794 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.440148 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.541134 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.642214 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.742346 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.843364 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.943507 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.043657 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.144805 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.245024 4839 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.345484 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.446169 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.546366 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.646558 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.747212 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.847937 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.948616 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.049740 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.150652 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.251090 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.351563 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.452628 4839 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.553730 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.654910 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.755263 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.855595 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.955961 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.056363 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.157537 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.258144 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.358801 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.459466 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.560189 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc 
kubenswrapper[4839]: E0321 04:24:40.661000 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.761690 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.800615 4839 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865176 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865232 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:40Z","lastTransitionTime":"2026-03-21T04:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968605 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968654 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968665 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968683 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968695 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:40Z","lastTransitionTime":"2026-03-21T04:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071219 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174023 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174103 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174171 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277143 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277200 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277244 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381204 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381223 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381237 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.420540 4839 apiserver.go:52] "Watching apiserver" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.427407 4839 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428642 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428717 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428784 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429261 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.429279 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429647 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.429980 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.430030 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433416 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433636 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433443 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434544 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434689 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433478 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.435055 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.435263 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.477040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484920 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484930 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.493339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.495982 4839 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.499851 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500001 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500109 
4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500202 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500394 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500486 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500697 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500813 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500910 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501013 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501109 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501035 4839 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501044 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.501136 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.001118281 +0000 UTC m=+86.328904957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502477 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501366 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501557 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501773 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501889 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502132 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502190 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502898 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503021 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503230 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503333 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503432 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503437 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503706 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503855 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503895 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503979 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504019 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504091 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504128 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504172 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504208 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504210 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" 
(OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504289 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504325 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504368 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504454 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504557 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504659 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504693 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504726 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504757 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504788 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.504821 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504852 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504888 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505018 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505048 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505041 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505065 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505181 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505205 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505228 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505245 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505261 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505280 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505317 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505337 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505351 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505368 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.505410 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505428 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505444 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505460 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505477 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505496 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505513 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505555 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505591 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505609 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505685 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505702 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505718 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505751 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505807 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505825 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505844 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505918 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505934 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505969 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505986 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506006 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506031 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506080 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506108 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506133 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506163 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506187 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506212 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506241 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506266 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504995 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505032 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505147 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505155 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505302 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505639 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505760 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506054 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506062 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506084 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506127 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506380 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506681 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506693 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506854 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506877 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506810 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507178 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507537 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507982 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507998 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508035 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508356 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508559 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509024 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509370 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509389 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509695 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509873 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509919 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509965 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510007 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510046 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510090 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510131 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510170 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510209 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510243 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510365 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511670 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511719 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512242 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512280 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512312 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512343 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512371 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512398 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512458 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512485 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512539 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512611 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512642 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512667 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512696 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512721 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512748 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512775 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512810 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512838 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512869 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512927 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512978 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513003 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513027 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513078 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513101 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513127 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509738 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509764 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509789 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509890 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509961 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509980 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510469 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510810 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511079 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511178 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511173 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511375 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511665 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511832 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512024 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512089 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512108 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512105 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512327 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.515778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516833 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516880 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516975 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517083 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517119 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517356 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517714 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518067 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518080 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518167 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518296 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518365 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518462 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518498 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518131 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518666 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518805 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519001 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513154 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519505 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519536 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519592 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519620 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519647 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519676 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519702 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519728 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName:
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519756 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519800 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519825 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519847 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519887 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519907 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519927 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519945 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519982 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520097 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520149 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 
crc kubenswrapper[4839]: I0321 04:24:41.520189 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520210 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520229 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520270 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520290 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520315 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520332 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520355 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520353 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520400 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520460 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520502 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520522 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520548 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520590 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520617 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520675 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520743 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520767 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520789 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520815 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520906 4839 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520920 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520945 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520956 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520967 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520978 4839 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520989 4839 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521000 4839 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521010 4839 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521020 4839 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521031 4839 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 
21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521041 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521051 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521060 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521071 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521080 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521091 4839 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521107 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521117 4839 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521137 4839 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521147 4839 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521159 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521168 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521177 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521187 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521196 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521205 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521215 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521226 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521236 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521247 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521256 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.521265 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521274 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521284 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521293 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526139 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526169 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526195 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526216 4839 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526239 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526266 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526303 4839 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526329 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526351 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526372 4839 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526394 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526413 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526433 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526453 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526472 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526491 4839 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526510 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526528 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 
04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526555 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526606 4839 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526631 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526651 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526671 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526692 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526712 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: 
I0321 04:24:41.526735 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526757 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526777 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526797 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526815 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526859 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526881 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526900 4839 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526920 4839 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526964 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526982 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527001 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527020 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527042 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527063 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") 
on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527082 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527101 4839 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527120 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527139 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527162 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527180 4839 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527200 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527222 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527240 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527266 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527287 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527306 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527325 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527348 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.527366 4839 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527383 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527401 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527421 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527438 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527458 4839 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527477 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520934 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521114 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521290 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521433 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521800 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521848 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527843 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527498 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528046 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522216 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522245 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522519 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.528005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528106 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.028089872 +0000 UTC m=+86.355876548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528417 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.528593 4839 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528637 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:24:42.028605018 +0000 UTC m=+86.356391734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523298 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523438 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523492 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523530 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.524339 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.524387 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525056 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525300 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525343 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525384 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526061 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526246 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526363 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526614 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526644 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526672 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526704 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527030 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527222 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527442 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522557 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529134 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529340 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529787 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529853 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.530420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.530417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.532403 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.532917 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533314 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533721 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534277 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534714 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534749 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.535070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.537554 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.539691 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.539893 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541636 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541819 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543304 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543316 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543490 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544649 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544719 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544843 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.546274 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.546480 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547200 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547261 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547338 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547595 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548905 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549034 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549894 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.551898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.553727 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.554154 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.554878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555054 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555241 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555733 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555761 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555853 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.055822995 +0000 UTC m=+86.383609671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555905 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555367 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556001 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556018 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556066 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.056051762 +0000 UTC m=+86.383838458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555457 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.556889 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557542 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.558075 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.558159 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.571018 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.571656 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.576307 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587708 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587933 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.588036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.588130 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.592437 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.593945 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628763 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628830 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628847 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628863 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628876 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628890 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628903 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628917 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628931 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628946 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628960 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 
04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628973 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628987 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629000 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629014 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629027 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629045 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629059 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629072 4839 
reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629087 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629100 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629113 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629127 4839 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629140 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629152 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629169 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629183 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629199 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629212 4839 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629230 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629242 4839 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629255 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629268 4839 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 
21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629283 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629297 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629310 4839 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629323 4839 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629340 4839 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629353 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629366 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629381 4839 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629397 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629410 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629423 4839 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629435 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629448 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629460 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629473 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629485 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629498 4839 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629511 4839 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629524 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629537 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629554 4839 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629592 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629609 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629625 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629642 4839 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629656 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629675 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629694 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629711 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node 
\"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629727 4839 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629745 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629763 4839 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629781 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629798 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629814 4839 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629831 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629848 4839 
reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629865 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629863 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629882 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629944 4839 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629964 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630082 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630107 4839 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630128 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630147 4839 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630170 4839 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630191 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630213 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node 
\"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630233 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630257 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630280 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630301 4839 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630323 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630342 4839 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630361 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630383 4839 
reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630403 4839 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630424 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630444 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630465 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630485 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630504 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 
04:24:41.690551 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690705 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690715 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.746150 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.755520 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.763752 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.777911 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env Mar 21 04:24:41 crc kubenswrapper[4839]: else Mar 21 04:24:41 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:24:41 crc kubenswrapper[4839]: exit 1 Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:24:41 crc kubenswrapper[4839]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: W0321 04:24:41.778412 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b WatchSource:0}: Error finding container 99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b: Status 404 returned error can't find the container with id 99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.779528 4839 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.781698 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.782941 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:41 crc kubenswrapper[4839]: W0321 04:24:41.785649 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046 WatchSource:0}: Error finding container 72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046: Status 404 returned error can't find the container with id 72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046 Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.789158 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 21 04:24:41 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:41 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:41 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:41 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:41 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:41 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:41 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:41 crc kubenswrapper[4839]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.791636 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc 
kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:41 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792781 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.792779 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: 
\"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792853 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792930 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.895994 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896067 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896106 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896134 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.943354 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.945319 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1f70681832960db1114ce205a81f59091acc07b1d7930b16e5ad95e81eedc8e5"} Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.945942 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.947103 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.947111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046"} Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.947151 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env Mar 21 04:24:41 crc kubenswrapper[4839]: else Mar 21 04:24:41 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:24:41 crc kubenswrapper[4839]: exit 1 Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:24:41 crc kubenswrapper[4839]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.948247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.948383 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 21 04:24:41 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:41 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:41 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:41 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:41 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:41 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:41 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:41 crc kubenswrapper[4839]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.950367 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:41 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.951642 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.955982 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.971419 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.983247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.992537 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998182 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998261 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998273 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.001600 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.012001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.021001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.030891 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033455 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.033668 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.033624184 +0000 UTC m=+87.361411000 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033800 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.033969 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034021 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.034010196 +0000 UTC m=+87.361796872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034074 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034113 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.034100979 +0000 UTC m=+87.361887865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.040647 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.049941 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.058366 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.069995 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100765 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100821 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100839 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100852 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.135433 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.135479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135648 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135665 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135676 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135712 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135781 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135799 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135728 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.135713147 +0000 UTC m=+87.463499823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.135878052 +0000 UTC m=+87.463664748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205550 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205615 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205625 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205643 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205658 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308097 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308135 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.409967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410029 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410050 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410088 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.456630 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.457118 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.457939 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.458633 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.459283 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.459868 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.460476 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.461103 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.461757 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.462252 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.462771 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.463421 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.463898 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.464420 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.464930 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.465406 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.466008 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.466443 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.468416 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.469721 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.471171 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.472244 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.473188 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.475107 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.476014 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.477666 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.478522 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.480019 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.481435 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.482844 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.483516 4839 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.483676 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.486671 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.487391 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.487958 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.490194 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.491646 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.492404 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.493850 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.495181 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.496753 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.498096 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.499876 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.500857 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.502146 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.502920 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.504104 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.505198 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.506394 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.507095 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.508293 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.509011 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.509825 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.511017 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512324 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512336 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614348 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614371 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614381 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716159 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716169 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818415 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818449 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818477 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920427 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920463 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920484 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022258 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022269 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022291 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041646 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041713 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041779 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.04175826 +0000 UTC m=+89.369544936 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041836 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041840 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041877 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.041869103 +0000 UTC m=+89.369655769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041889 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:24:45.041883954 +0000 UTC m=+89.369670630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.125000 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.142266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.142323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142427 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142437 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142476 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142489 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 
04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142540 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.142521692 +0000 UTC m=+89.470308368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142443 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142922 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.143035 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.143020437 +0000 UTC m=+89.470807173 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227094 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227127 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227136 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227156 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329399 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329409 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329437 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.431952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.431994 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432033 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451848 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451869 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451861 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.451978 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.452115 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.452230 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534361 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534401 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534411 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534454 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.637831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638345 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638450 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739321 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739370 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739395 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739405 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.748939 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751826 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751848 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751856 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.759743 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763693 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763705 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.771959 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.775617 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.775654 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.775662 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.775676 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.775685 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.784404 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787093 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.795673 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.795781 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798168 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900223 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900295 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900307 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900346 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.004082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.004165 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108098 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108163 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108172 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211452 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211624 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314280 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314297 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417535 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417559 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417589 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520880 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520995 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.521030 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.521054 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624253 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624308 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730267 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730307 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839615 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839634 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839647 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942708 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942779 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942828 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942851 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046275 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.064860 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.064975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.065029 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065156 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065220 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.065201221 +0000 UTC m=+93.392987897 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065409 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065507 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.06549698 +0000 UTC m=+93.393283656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065661 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.065651095 +0000 UTC m=+93.393437781 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149447 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149599 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.165657 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.165747 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165912 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165936 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165965 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165971 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165986 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165993 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.166083 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.166059726 +0000 UTC m=+93.493846432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.166561 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.166544241 +0000 UTC m=+93.494330977 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252384 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252451 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252496 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354721 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354773 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354807 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452118 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452148 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452196 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452278 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456637 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456706 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456719 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559186 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559195 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559220 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661384 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661487 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763720 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763732 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865635 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865644 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968258 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071081 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071206 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071302 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174240 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174252 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174268 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174280 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276334 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276364 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378533 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378541 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378555 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378579 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.465222 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.475210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.481609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.481789 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 
04:24:46.481967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.482122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.482260 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.489291 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.500661 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.510840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.520473 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584313 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584327 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584337 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686471 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686534 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686544 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788824 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788871 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788881 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788898 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788910 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890681 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890722 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890751 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890762 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993345 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993389 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993423 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993433 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095296 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.196939 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.196985 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197010 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197033 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299404 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299768 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.300003 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.300097 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405265 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405301 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405314 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452235 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452319 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452397 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452478 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452532 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452710 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.463220 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.463381 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.463540 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507291 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507318 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507327 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609245 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609292 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711928 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711953 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813839 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813854 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813865 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916448 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916495 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916521 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916531 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.963124 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.963267 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018588 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018626 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018636 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120956 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120982 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120993 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222723 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222734 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325330 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325395 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325425 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427336 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427352 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427397 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529418 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.631997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632084 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734552 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734626 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734644 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734667 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734682 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837225 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.939960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940026 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940063 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043262 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043303 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043320 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097635 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097557346 +0000 UTC m=+101.425344072 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097700 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097757 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097779 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097757522 +0000 UTC m=+101.425544238 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097920 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097986 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097972249 +0000 UTC m=+101.425758955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145774 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145849 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145905 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145931 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.199214 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.199285 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199451 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199507 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199526 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199460 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199646 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199616 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.199564127 +0000 UTC m=+101.527350803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199667 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199799 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.199771433 +0000 UTC m=+101.527558139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247911 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247937 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247949 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350460 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350470 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451712 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451771 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.451890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451792 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.451993 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.452202 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453289 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453348 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453366 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555381 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555404 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555420 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657233 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657311 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657341 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759196 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759253 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759271 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759289 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759300 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862172 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862243 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968536 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968594 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070951 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173259 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173433 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173457 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173469 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276560 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276635 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276660 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276672 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379317 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379376 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379390 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379426 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481396 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481468 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584558 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584697 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584727 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584747 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688275 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688391 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688401 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790880 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790902 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893408 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893438 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893468 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994890 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994985 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097319 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097331 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097346 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097357 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200230 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200271 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.301981 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302058 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403732 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403765 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452156 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452200 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452173 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452276 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452377 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452452 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506358 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506420 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608442 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608493 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608518 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608528 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711769 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711790 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711798 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814583 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814598 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814621 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814635 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.916953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917009 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917019 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917034 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917046 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019985 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122775 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122857 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122865 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225628 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225692 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225705 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225728 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327302 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327335 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327344 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327368 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430394 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430471 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533744 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636175 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636234 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739316 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739329 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843484 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843526 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843544 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945248 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945287 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047786 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047849 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047912 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151531 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151558 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.254462 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.255867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256027 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256420 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358855 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358884 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.452197 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.452339 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.452443 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.452655 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.453069 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.453504 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461594 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461612 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.475466 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564046 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564097 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564136 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564154 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666762 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666825 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666885 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770017 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770092 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.877932 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878011 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878041 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878072 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878092 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897427 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897436 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897471 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.913747 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919109 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919136 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.932773 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938202 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938238 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.951777 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955377 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955403 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955424 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955433 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.968929 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973046 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973081 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973102 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.987453 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.987640 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989204 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989240 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989255 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989265 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092007 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092118 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.194998 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195085 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195115 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195135 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298008 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298114 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298143 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400499 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400766 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503166 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503237 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503246 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605912 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605945 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605958 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605969 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708093 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708157 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810308 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810351 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810373 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913172 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913182 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913206 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015251 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015327 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015336 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.117980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118045 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220454 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220494 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322522 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322617 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322665 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424886 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424958 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452547 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452556 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452640 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.452775 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.453138 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.453275 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.454797 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:55 crc kubenswrapper[4839]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:55 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:55 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:55 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: fi Mar 21 04:24:55 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 21 04:24:55 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:55 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:55 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:55 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:55 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:55 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:55 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:55 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:55 crc 
kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:55 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:55 crc kubenswrapper[4839]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:55 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:55 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.458425 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:55 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:55 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:55 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:55 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: fi Mar 21 04:24:55 crc kubenswrapper[4839]: Mar 21 04:24:55 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:55 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:55 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:55 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:55 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.459505 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.523258 4839 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527470 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527508 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629491 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629519 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629548 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732276 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732324 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732344 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834429 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834510 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834554 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937504 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937548 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040633 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040649 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153807 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153866 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153909 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257118 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257163 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359449 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359469 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.454604 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:56 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:24:56 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:56 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:24:56 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env Mar 21 04:24:56 crc kubenswrapper[4839]: else Mar 21 04:24:56 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:24:56 crc kubenswrapper[4839]: exit 1 Mar 21 04:24:56 crc kubenswrapper[4839]: fi Mar 21 04:24:56 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:24:56 crc kubenswrapper[4839]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:56 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.454817 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.455977 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.456018 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.460996 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461027 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461049 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461059 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.465463 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de4457
5a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.474417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.484112 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.493278 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.503623 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.508349 4839 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.519121 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.528977 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.540998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563457 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563495 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 
04:24:56.563506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563523 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563537 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666611 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666661 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666693 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666704 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.768992 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769247 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769368 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769550 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.873085 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976362 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078816 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078882 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078900 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078924 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078942 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170443 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170521 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170650 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.170611285 +0000 UTC m=+117.498397991 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170648 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170847 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.170828332 +0000 UTC m=+117.498615098 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170973 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.171107 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.17107739 +0000 UTC m=+117.498864106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.181995 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182039 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182067 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182078 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.271964 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.272005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272103 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272120 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272121 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272169 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272187 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272130 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272253 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.272232144 +0000 UTC m=+117.600018850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272277 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.272266255 +0000 UTC m=+117.600052961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284813 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284877 4839 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388334 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388361 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388383 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452410 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452493 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452413 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452682 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452681 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491076 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491085 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491109 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593551 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593627 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593644 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593694 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696636 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696648 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799753 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799770 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799810 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902993 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.903006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.903015 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005597 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005616 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005656 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.107988 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108057 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108109 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108149 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211112 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313911 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313977 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.314009 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415739 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415822 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415834 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.453129 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518040 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518103 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518165 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621681 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621694 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621714 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621725 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724620 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724671 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827307 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827335 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827389 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930548 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930589 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.990399 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.991702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.992058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.012121 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o:
//6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.023890 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.031779 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033141 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 
04:24:59.033153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033181 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.040988 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.048924 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.056231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.063231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.072103 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135508 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135531 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135540 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237166 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237199 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340237 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443003 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443045 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443084 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452409 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452432 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452560 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452662 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452861 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452949 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546694 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546727 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649921 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752464 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752494 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752503 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855269 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957915 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957925 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060468 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060499 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060527 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163160 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163185 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163196 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.265759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266141 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266180 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.278645 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-g47qh"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.279117 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281454 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281669 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281540 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.299229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.310139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.318608 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.337531 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.349339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.357453 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.366518 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368680 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.376672 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.382840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.398453 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.398547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod 
\"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470853 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470971 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.471012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.471041 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499067 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499125 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499217 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.516375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.573930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574300 4839 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574492 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.603874 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: W0321 04:25:00.624947 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode646dbcd_c976_48e4_8dee_497be8a275bf.slice/crio-ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307 WatchSource:0}: Error finding container ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307: Status 404 returned error can't find the container with id ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307 Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.649348 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jx4q7"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.650106 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653053 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653447 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653814 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653989 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-scp2c"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.654339 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.655036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.655892 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.656626 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zqcw4"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.657166 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658229 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658445 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658444 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658752 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658833 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.660804 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.661120 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.667012 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.677653 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679439 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679492 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679503 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.691247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.704210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.737879 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.755049 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782200 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.786190 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.796903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801729 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801776 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqbw\" (UniqueName: \"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801842 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801864 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801884 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801923 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801943 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801999 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802017 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802037 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802081 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802123 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802191 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802230 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802244 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6trk\" (UniqueName: \"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.807182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.815190 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.823711 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.833062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.844772 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.856433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.875232 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.885174 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887741 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887785 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887800 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887813 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.896962 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903470 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903491 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903584 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6trk\" (UniqueName: 
\"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903711 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903757 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903803 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqbw\" (UniqueName: 
\"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903826 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903846 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903867 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904004 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod 
\"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904053 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904100 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904143 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod 
\"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904294 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904348 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904451 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 
04:25:00.904483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904545 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904596 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904684 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904931 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905172 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905313 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905350 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905384 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: 
\"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905408 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905601 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906092 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " 
pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906820 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.909491 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.919530 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.919960 4839 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8
e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.925171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6trk\" (UniqueName: \"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.928252 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqbw\" (UniqueName: \"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.931295 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.940122 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.947893 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.953987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.979256 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: W0321 04:25:00.990274 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f92fefb_d5cd_451a_8bbe_31eea55d5bd9.slice/crio-f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70 WatchSource:0}: Error finding container f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70: Status 404 returned error can't find the container with id f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70 Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990829 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990907 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.992058 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.996731 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g47qh" event={"ID":"e646dbcd-c976-48e4-8dee-497be8a275bf","Type":"ContainerStarted","Data":"2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.996794 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g47qh" event={"ID":"e646dbcd-c976-48e4-8dee-497be8a275bf","Type":"ContainerStarted","Data":"ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.997929 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.998047 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqcw4" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.008160 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.014148 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0848faa_daf7_4b62_a20f_36d92678db1d.slice/crio-811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730 WatchSource:0}: Error finding container 811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730: Status 404 returned error can't find the container with id 811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730 Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.019670 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1602189b_f4f3_40ee_ba63_c695c11069d0.slice/crio-5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd WatchSource:0}: Error finding container 5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd: Status 404 returned error can't find the container with id 5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.019779 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.020741 4839 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.022821 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023200 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023233 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023329 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023439 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.024962 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.034485 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.049764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.067635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.077049 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.087062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093771 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093814 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093828 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093837 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.094042 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.105371 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107645 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107678 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107695 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107711 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107728 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 
04:25:01.107770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107785 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107830 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107856 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107957 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108006 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108042 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108070 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.119998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.131985 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.141400 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.150368 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.158485 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.169433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.183437 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.194038 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195387 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195417 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195430 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195455 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.205053 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209528 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209562 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209609 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209632 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209648 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209662 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209685 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209761 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209782 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209815 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209860 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209908 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209922 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209941 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209955 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209974 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210012 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210206 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210232 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210250 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: 
\"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210270 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210289 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210310 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc 
kubenswrapper[4839]: I0321 04:25:01.210609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210642 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.212916 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.219655 4839 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269de
ec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e
51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.226016 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.227717 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.243878 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.253139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.260832 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.268269 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.277149 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.297990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298106 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298122 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.376055 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.394375 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd634043b_c9ec_4469_b267_26053b1f02f9.slice/crio-f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1 WatchSource:0}: Error finding container f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1: Status 404 returned error can't find the container with id f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1 Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399467 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399542 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.451875 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.451942 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452001 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452070 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.452117 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452175 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502178 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502210 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502236 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502248 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.605990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606144 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606170 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708864 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708889 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811937 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811997 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915351 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915417 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915497 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.002857 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.002917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005487 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" exitCode=0 Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005640 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.008545 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.008635 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012085 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea" exitCode=0 Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012154 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.016024 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018208 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018235 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.030449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e0
7bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.046881 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.061626 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.075415 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.088684 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.112494 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127803 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127836 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127844 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127868 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.134516 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.146050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.161030 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.171151 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.179769 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.189450 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.199662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.211998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.227098 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230043 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230087 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230136 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.237250 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.249722 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.259235 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.272001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.280263 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.293503 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.304314 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.322810 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.331413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332532 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332987 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.346555 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436025 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436144 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538564 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538647 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538675 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538698 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538712 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.640842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641304 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744755 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744887 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.745103 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848352 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848385 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848398 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951420 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951507 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951521 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.022101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.042246 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.053213 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054007 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054090 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054137 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.068187 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.084715 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.094116 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.103495 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.112971 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.122101 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.130470 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.143983 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.153944 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157018 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157064 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157107 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.167093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.177978 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259711 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259722 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259742 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361871 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361883 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361898 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361909 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.451868 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451787 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.451929 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.452067 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464163 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464172 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.566947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567004 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567045 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567063 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670188 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670197 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773017 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773084 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773126 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876189 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876259 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876300 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983262 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983285 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.031007 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.031100 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.033803 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad" exitCode=0 Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.033856 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.063761 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.077060 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086818 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086900 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.098989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.108592 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.116443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.126390 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.136897 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.144740 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.153736 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.162894 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.171903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.181028 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189161 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189245 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189276 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189297 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.193063 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc 
kubenswrapper[4839]: I0321 04:25:04.292377 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292421 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292436 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321418 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321594 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.335639 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340731 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340881 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340907 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340957 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.356754 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361112 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361137 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.374777 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379720 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379811 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379912 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.393929 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399502 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399513 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.408786 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.408900 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410725 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410740 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410749 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518268 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518291 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621753 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621818 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621837 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621851 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724329 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827480 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827532 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.930976 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931025 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931053 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931066 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.034961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035041 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035119 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.040028 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2" exitCode=0 Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.040094 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.061447 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.084605 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.101878 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.116284 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 
04:25:05.136977 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.137005 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.142229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.154622 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.172143 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.180705 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.192617 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.200313 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.209333 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.217559 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.223849 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239250 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239261 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341164 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341494 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341504 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341526 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443392 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443401 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443421 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452233 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452237 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452310 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452438 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452467 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452515 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545852 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648347 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648418 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751371 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751402 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751429 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751441 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854287 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854301 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956874 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956908 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.047421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.051729 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4" exitCode=0 Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.051789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061857 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061888 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.068805 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.080826 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.096185 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.108126 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.123556 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.134678 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.146918 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.158477 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163537 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163561 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163586 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163605 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163616 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.170718 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.181105 
4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.190258 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.203728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.220081 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265372 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367822 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367897 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.456016 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:06 crc kubenswrapper[4839]: E0321 04:25:06.456116 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472767 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472821 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472843 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.488765 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.497270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.508297 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.517138 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.527978 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.542813 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.556602 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f
745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.563367 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 
04:25:06.575208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575236 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.576503 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.587002 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.596441 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.606253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.612449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677686 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677723 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677745 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782357 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782430 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782447 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.816252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sxs57"] Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.816636 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.818913 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.818961 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.819009 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.820317 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.834141 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb8
04e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.841824 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.852050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.862618 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.873341 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884859 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884964 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.893407 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.901635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.917152 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.928291 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.936306 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.946794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.957524 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.965533 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969818 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969896 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.974803 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987838 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987869 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.057791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.069227 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070642 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070732 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: 
\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.071865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.079159 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d2347692302
62450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.089598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.089539 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090250 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090289 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090303 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090317 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090330 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.096801 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.105637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.113413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.128027 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.130094 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.144236 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37
23269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"c
ri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: W0321 04:25:07.147293 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode99177d8_5f41_4cee_a2c9_ae1c314d9d8d.slice/crio-f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727 WatchSource:0}: Error finding container f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727: Status 404 returned error can't find the container with id 
f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727 Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.158266 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.169703 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.180926 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.190266 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194237 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194306 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.203756 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.211670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297812 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297877 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297888 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401256 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401265 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.452440 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.452458 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:07 crc kubenswrapper[4839]: E0321 04:25:07.452611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:07 crc kubenswrapper[4839]: E0321 04:25:07.452694 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503316 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503364 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503375 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503392 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605662 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605701 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707596 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707604 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707631 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813885 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813900 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813908 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916127 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916139 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018256 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018268 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064061 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064342 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064371 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.066779 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07" exitCode=0 Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.066827 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.068550 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxs57" event={"ID":"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d","Type":"ContainerStarted","Data":"4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.068618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxs57" 
event={"ID":"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d","Type":"ContainerStarted","Data":"f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.072267 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.082174 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.089519 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.090100 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.091307 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.102497 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.115991 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121146 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121157 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121175 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121188 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.131656 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.141450 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.163794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.170300 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.185412 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.195717 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.206264 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.215210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224285 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.226383 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e0
7bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.236022 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.245597 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.257752 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.269357 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9
gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.276026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.290111 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.298120 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.310866 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.320105 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326023 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326061 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326089 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326099 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.328989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.338259 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.344525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.352355 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.359128 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428824 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428912 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428921 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.452671 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:08 crc kubenswrapper[4839]: E0321 04:25:08.453435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531432 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531473 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531513 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633378 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633388 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633402 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633412 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736369 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736378 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838206 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838231 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838242 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941145 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044001 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044064 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.073933 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.074005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.078219 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3" exitCode=0 Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.078291 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.096330 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.106955 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.124098 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.136305 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9
gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.147364 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.148757 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.148780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149541 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.160242 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.171792 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.187390 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.204721 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818
bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.214132 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.223547 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.235707 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.243406 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252391 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252413 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.254760 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.271600 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.284920 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.301790 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.313727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.327204 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.339728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.349593 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354521 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354533 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.360312 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.370890 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.385185 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.394010 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.406100 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.418409 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.427182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.452539 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.452701 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:09 crc kubenswrapper[4839]: E0321 04:25:09.452724 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:09 crc kubenswrapper[4839]: E0321 04:25:09.452908 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458180 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458256 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562763 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562899 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562963 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665772 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665808 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665839 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769019 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769059 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769098 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.871794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.871990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872053 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872192 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975008 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975078 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975089 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078065 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078137 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.083910 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.102895 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:2
3:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.113352 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.132603 4839 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.145930 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.158191 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.171395 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181912 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181955 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.182040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.192624 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.202070 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.213927 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.226894 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.242586 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.257152 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.270507 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284730 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284791 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284811 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284823 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387163 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.452181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:10 crc kubenswrapper[4839]: E0321 04:25:10.453111 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489601 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489625 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489672 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591938 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591999 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.592008 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694143 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694157 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694166 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796261 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796337 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899776 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899830 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002717 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002819 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.089226 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.093056 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/0.log" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.099761 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" exitCode=1 Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.099850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.100754 4839 scope.go:117] "RemoveContainer" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106704 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106737 4839 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106753 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.112256 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.127435 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.140417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.154065 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.168215 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.186181 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.197487 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208676 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208707 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208738 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.214940 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.228562 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.241026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.253652 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.262168 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.276413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.287111 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.299133 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.309860 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310678 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310707 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310730 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310739 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.322306 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.334216 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.343617 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.370943 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.393486 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412323 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412368 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412411 4839 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.423116 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.434599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.444557 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.452035 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.452081 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:11 crc kubenswrapper[4839]: E0321 04:25:11.452114 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:11 crc kubenswrapper[4839]: E0321 04:25:11.452208 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.455179 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.465003 4839 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.474582 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.484465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.518936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.518984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.519016 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc 
kubenswrapper[4839]: I0321 04:25:11.519035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.519045 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621275 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621313 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621325 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621341 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621352 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724001 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724087 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724104 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825934 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825962 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825974 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937390 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937399 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937421 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039943 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039971 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039984 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.105457 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.106627 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/0.log" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109456 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" exitCode=1 Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109511 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109542 4839 scope.go:117] "RemoveContainer" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.110237 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:12 crc kubenswrapper[4839]: E0321 04:25:12.110367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.113425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.131932 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.141182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc 
kubenswrapper[4839]: I0321 04:25:12.142071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142081 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.153106 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.164843 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.178093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.189599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.199339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.220755 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.235787 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244669 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244678 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244699 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.258004 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.268863 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.279431 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.290305 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.298429 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.317904 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.330188 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346667 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346702 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346712 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346740 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346906 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.358613 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.368193 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.379143 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.389666 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.401321 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.411365 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.423261 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.433659 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.447403 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448658 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448696 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448709 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448735 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448748 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.451778 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:12 crc kubenswrapper[4839]: E0321 04:25:12.451901 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.459122 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\"
:\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.466781 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\
"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550556 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550586 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550597 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.591194 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57"] Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.591733 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.593731 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.594207 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.605506 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.617817 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.630239 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.640776 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642311 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc 
kubenswrapper[4839]: I0321 04:25:12.642446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642469 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652754 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652768 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652777 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.655514 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-b
incopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.666745 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.677723 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.693879 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.703448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.719449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 
04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87
f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.729375 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743764 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743840 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743912 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.744650 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.744935 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.746396 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.752936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754927 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754983 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.759539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.763285 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.775951 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.786494 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857508 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857540 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857549 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857584 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.904299 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: W0321 04:25:12.918455 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dee692e_c3b8_4538_86d7_210dd7e96173.slice/crio-e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d WatchSource:0}: Error finding container e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d: Status 404 returned error can't find the container with id e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960015 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960123 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062265 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062290 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062299 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.121874 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.125346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164491 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164518 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248602 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248719 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248778 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.248950 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249020 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.249004347 +0000 UTC m=+149.576791023 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249112 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.24910251 +0000 UTC m=+149.576889186 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249179 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249219 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.249209323 +0000 UTC m=+149.576995999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266483 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266560 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266604 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266619 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.328624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"] Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.329054 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.329103 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.343649 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b26h\" (UniqueName: \"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.350015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350152 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350183 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350180 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350222 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350233 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350280 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.350265113 +0000 UTC m=+149.678051789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350196 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350333 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.350319195 +0000 UTC m=+149.678105861 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.354690 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.367432 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369865 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369919 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369951 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.380916 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.394945 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.407134 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.416163 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.433906 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.450035 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.451735 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.451778 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.451851 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.451980 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.452353 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b26h\" (UniqueName: \"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.452391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.452524 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.452581 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.952556432 +0000 UTC m=+118.280343108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.466319 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.467364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b26h\" (UniqueName: 
\"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472431 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472459 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472471 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472497 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.476948 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.486414 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc 
kubenswrapper[4839]: I0321 04:25:13.503932 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.515847 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.531903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.544471 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.576885 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577528 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679901 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679966 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.781973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782120 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885308 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885318 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.956775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.956948 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.957010 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:14.956993072 +0000 UTC m=+119.284779748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987116 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987125 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987147 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089608 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089619 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089636 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089649 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.129785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.129824 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.142745 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.157520 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.176551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.188624 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192190 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192201 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192225 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.201760 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.212981 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.221987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.234819 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.267437 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.294640 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295199 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.308350 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z 
is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.318041 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc 
kubenswrapper[4839]: I0321 04:25:14.334373 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.344034 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.359362 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.369945 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397550 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 
04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397634 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397658 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.451975 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.452092 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500079 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500134 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529662 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529679 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.540467 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543623 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543632 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.553650 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557076 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557086 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557154 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.571361 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575157 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.588870 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.592999 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593050 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593065 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593107 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.605908 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.606041 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608292 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608306 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711173 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711258 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711269 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813638 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813654 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813668 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915892 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915991 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.916006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.916016 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.967058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.967266 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.967330 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:16.967310385 +0000 UTC m=+121.295097071 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017917 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017957 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017979 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017988 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.119960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.119997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120020 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120029 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222248 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222257 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325809 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325820 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428168 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428178 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452324 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452338 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.452473 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.452607 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452794 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.453049 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530766 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530802 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530855 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633630 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633650 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633666 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736203 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736247 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736260 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736281 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736293 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838633 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838676 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838692 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941755 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941848 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941860 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045032 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045156 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150509 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150541 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253424 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253514 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355442 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355507 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.451870 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.452761 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.455660 4839 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.464614 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.478979 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.490342 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.505146 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.517282 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.527798 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc 
kubenswrapper[4839]: E0321 04:25:16.535927 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.547752 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc
0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.558270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.577728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.587805 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.596744 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.607097 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.617683 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.627956 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.640499 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.650908 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.985875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.986041 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.986137 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:20.986112183 +0000 UTC m=+125.313898899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451932 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451952 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451972 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452401 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452256 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.775224 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.791163 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.803702 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc 
kubenswrapper[4839]: I0321 04:25:17.829618 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.847153 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.867480 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.881145 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.892550 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.907667 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.921702 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.933253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.946435 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.959828 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.972846 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.988288 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.002595 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.017754 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.452544 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:18 crc kubenswrapper[4839]: E0321 04:25:18.452778 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452773 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.452893 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452781 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.453048 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.453167 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:20 crc kubenswrapper[4839]: I0321 04:25:20.451858 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:20 crc kubenswrapper[4839]: E0321 04:25:20.452236 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.028317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.028426 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.029042 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:29.029022604 +0000 UTC m=+133.356809280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.452303 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.452345 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.452487 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.452663 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.453002 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.453372 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.537519 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 21 04:25:22 crc kubenswrapper[4839]: I0321 04:25:22.452443 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:22 crc kubenswrapper[4839]: E0321 04:25:22.452741 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.452777 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.452949 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.452800 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.453031 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.453072 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.453118 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.452380 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.452643 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809525 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809682 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.830342 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834467 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834553 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834597 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.848249 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852129 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852168 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852180 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852209 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.867287 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871679 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.884379 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888460 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888554 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888596 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888620 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.900621 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.900728 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452351 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452761 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452812 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452841 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452880 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.467328 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e0
1a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.481323 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.494660 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.511410 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.523231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.544974 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.555784 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.576433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.587635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.597893 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc 
kubenswrapper[4839]: I0321 04:25:25.612756 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd
21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.631623 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.643809 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.651423 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.661965 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.671239 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.178131 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.181526 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" 
event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2"} Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.182367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.195263 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.207040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.223423 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.237070 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.248051 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.267088 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.278497 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.294314 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.304779 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.313087 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc 
kubenswrapper[4839]: I0321 04:25:26.323802 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd
21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.334967 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.347052 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.357556 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.369662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.386727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.452397 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:26 crc kubenswrapper[4839]: E0321 04:25:26.452604 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.479989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.497555 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.512532 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc 
kubenswrapper[4839]: I0321 04:25:26.533542 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: E0321 04:25:26.538256 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.550835 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.565604 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.579831 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.591299 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.606727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.621398 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.634348 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.649725 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.659998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.668522 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.679304 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.690365 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.186136 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.186942 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190328 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" exitCode=1 Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190391 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" 
event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2"} Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190437 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190898 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.191042 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.218069 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 
openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f
3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.231552 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.244231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc 
kubenswrapper[4839]: I0321 04:25:27.266199 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.280704 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.296361 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.312916 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.327461 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.341164 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.357194 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.370340 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.388434 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.404741 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.416798 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.434888 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.449941 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452162 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452230 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452332 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452414 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452504 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.195593 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.198492 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:28 crc kubenswrapper[4839]: E0321 04:25:28.198662 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.209640 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.222599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.236987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.250050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.261237 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.270637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.286591 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.298383 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.312595 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.332630 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.347335 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.367009 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.379279 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.391300 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc 
kubenswrapper[4839]: I0321 04:25:28.414132 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.431722 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.452176 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:28 crc kubenswrapper[4839]: E0321 04:25:28.452511 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.109291 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.109514 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.109623 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.109599733 +0000 UTC m=+149.437386439 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452631 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452670 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.452833 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452670 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.453124 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.453186 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:30 crc kubenswrapper[4839]: I0321 04:25:30.465002 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:30 crc kubenswrapper[4839]: E0321 04:25:30.465221 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452668 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452702 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.452793 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.452920 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452958 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.453036 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.539627 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:32 crc kubenswrapper[4839]: I0321 04:25:32.452653 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:32 crc kubenswrapper[4839]: E0321 04:25:32.453068 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.452722 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.452724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.452865 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.453025 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.453919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.454162 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.452116 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:34 crc kubenswrapper[4839]: E0321 04:25:34.452237 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.468996 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953415 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953538 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953559 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:34Z","lastTransitionTime":"2026-03-21T04:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:34 crc kubenswrapper[4839]: E0321 04:25:34.977180 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983754 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983814 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983873 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:34Z","lastTransitionTime":"2026-03-21T04:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.006040 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011643 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011671 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.029610 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034717 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034771 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034785 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034895 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.053125 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058037 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058104 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058168 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.077664 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.077895 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452104 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452143 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452239 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452156 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452673 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452762 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.452325 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:36 crc kubenswrapper[4839]: E0321 04:25:36.452515 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.471075 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.488352 4839 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc 
kubenswrapper[4839]: I0321 04:25:36.523764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: E0321 04:25:36.540443 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.543668 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.584102 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.606480 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.626917 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.650775 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.665535 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.678662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.691264 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.707554 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.725019 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.740715 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.760511 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.776200 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.789532 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.452558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.452707 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.452748 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.452837 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.453366 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.453525 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:38 crc kubenswrapper[4839]: I0321 04:25:38.452609 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:38 crc kubenswrapper[4839]: E0321 04:25:38.452778 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452555 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.453358 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452733 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.453498 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.454692 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.469758 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 04:25:40 crc kubenswrapper[4839]: I0321 04:25:40.451853 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:40 crc kubenswrapper[4839]: E0321 04:25:40.451975 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452794 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.453890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.454021 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452907 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.454106 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.542163 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:42 crc kubenswrapper[4839]: I0321 04:25:42.452464 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:42 crc kubenswrapper[4839]: E0321 04:25:42.453176 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:42 crc kubenswrapper[4839]: I0321 04:25:42.453871 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:42 crc kubenswrapper[4839]: E0321 04:25:42.454141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452651 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452743 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452811 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:44 crc kubenswrapper[4839]: I0321 04:25:44.452892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:44 crc kubenswrapper[4839]: E0321 04:25:44.453083 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.179146 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.179407 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.179532 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:17.179499645 +0000 UTC m=+181.507286491 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.280791 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.280995 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.280947992 +0000 UTC m=+213.608734678 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.281074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.281160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281240 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281307 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.281292273 +0000 UTC m=+213.609078949 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281373 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281505 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.281461188 +0000 UTC m=+213.609248064 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.382905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.383033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383140 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383181 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383195 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383254 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.383237285 +0000 UTC m=+213.711023961 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383297 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383326 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383343 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383452 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.3834009 +0000 UTC m=+213.711187756 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452486 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453168 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452513 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453271 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453391 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470903 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470920 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470948 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470966 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.485703 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489592 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489602 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489627 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.502702 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506733 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506750 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506793 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.518375 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522611 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522658 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522700 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.536228 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541556 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541670 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541698 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541735 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.555281 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.555430 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.452105 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:46 crc kubenswrapper[4839]: E0321 04:25:46.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.463003 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.472682 4839 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc 
kubenswrapper[4839]: I0321 04:25:46.495825 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.508804 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.527551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.542607 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: E0321 04:25:46.543327 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.557347 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.569215 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.581270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.594360 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.605789 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.622108 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.635334 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.647967 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.658965 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.672202 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.683191 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.697289 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452014 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452080 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452014 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452210 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452301 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265040 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265094 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747" exitCode=1 Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747"} Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265473 4839 scope.go:117] "RemoveContainer" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.284063 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.296541 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.313613 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc 
openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.325631 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.335865 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc 
kubenswrapper[4839]: I0321 04:25:48.350551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.364676 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.375712 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.386720 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.399882 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.411757 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.423764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.433909 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.445812 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.452509 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:48 crc kubenswrapper[4839]: E0321 04:25:48.452616 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.458129 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.470355 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.480535 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.490548 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.269678 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.269733 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"} Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.281525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.294990 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.307778 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.318708 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.328922 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.339816 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.350464 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.360970 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.369508 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.382026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.392448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.406753 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.417675 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.426085 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc 
kubenswrapper[4839]: I0321 04:25:49.443426 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.452722 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.452900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.453064 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453056 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453249 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453394 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.454249 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.471075 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.480229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:50 crc kubenswrapper[4839]: I0321 04:25:50.451971 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:50 crc kubenswrapper[4839]: E0321 04:25:50.452243 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452161 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452188 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452287 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452332 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452473 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452538 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.545060 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:52 crc kubenswrapper[4839]: I0321 04:25:52.452492 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:52 crc kubenswrapper[4839]: E0321 04:25:52.452627 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.452979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.453109 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453182 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.453233 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453383 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453506 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.454669 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.286609 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.289581 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.290050 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.308810 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.319670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.340045 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc 
openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.350728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.360339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc 
kubenswrapper[4839]: I0321 04:25:54.375705 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.393145 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.404460 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.415794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.426948 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.438126 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.448123 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.452437 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:54 crc kubenswrapper[4839]: E0321 04:25:54.452527 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.458182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.468213 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.478253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.490536 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.501696 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.510670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.295314 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.296253 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298855 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" exitCode=1 Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298992 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.299948 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.300225 
4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.325416 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.340459 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.363720 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.380443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.398559 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.420443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.433427 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452748 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.452892 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452935 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.453090 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.453173 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.459093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} 
typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.470139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.484325 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc 
kubenswrapper[4839]: I0321 04:25:55.498790 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.511680 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.522261 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.533509 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.542438 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.551877 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.563831 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.575931 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898670 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898721 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc 
kubenswrapper[4839]: I0321 04:25:55.898756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898772 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.912166 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916293 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916322 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.935714 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940347 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940415 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.956550 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961740 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961865 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961920 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.974408 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978259 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.990248 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.990375 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.303335 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.306752 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:25:56 crc kubenswrapper[4839]: E0321 04:25:56.306907 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.319515 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.330162 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.342465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.353658 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.366523 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.378330 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.390520 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.401196 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.412689 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.426840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.441176 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.452065 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:56 crc kubenswrapper[4839]: E0321 04:25:56.452329 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.452201 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.462283 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.466558 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.478788 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.498525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.513150 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.523823 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc 
kubenswrapper[4839]: E0321 04:25:56.546060 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.547448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc
0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.559889 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.569558 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.582409 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.593860 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.608003 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.621175 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.638918 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.654898 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.665408 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.683518 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.693764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.710758 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.721062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.730643 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc 
kubenswrapper[4839]: I0321 04:25:56.740627 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.752061 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.762901 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.774140 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.783698 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452737 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.452877 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.453031 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.453206 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:58 crc kubenswrapper[4839]: I0321 04:25:58.452273 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:58 crc kubenswrapper[4839]: E0321 04:25:58.452466 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452126 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452272 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452281 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452476 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452730 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452869 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:00 crc kubenswrapper[4839]: I0321 04:26:00.452405 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:00 crc kubenswrapper[4839]: E0321 04:26:00.453185 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.452883 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.452962 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452816 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.453077 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.547552 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:02 crc kubenswrapper[4839]: I0321 04:26:02.451919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:02 crc kubenswrapper[4839]: E0321 04:26:02.452035 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452370 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452417 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452432 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452533 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452725 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452771 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:04 crc kubenswrapper[4839]: I0321 04:26:04.452732 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:04 crc kubenswrapper[4839]: E0321 04:26:04.453513 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452390 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452388 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452596 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452505 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452816 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368990 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.381795 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384983 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.385006 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.399597 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.415027 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418775 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418789 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418817 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.431805 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435031 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435069 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435079 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435104 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.445330 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.445489 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.452515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.452693 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.466413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.484922 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.496859 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.510579 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.523839 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.535394 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.547969 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.549114 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.562429 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.573466 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.585212 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.600483 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.614052 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.627334 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.640281 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.659115 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.671076 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.698297 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.709384 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.718386 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:07 crc 
kubenswrapper[4839]: I0321 04:26:07.452108 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:07 crc kubenswrapper[4839]: I0321 04:26:07.452149 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:07 crc kubenswrapper[4839]: I0321 04:26:07.452150 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452266 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452380 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452431 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:08 crc kubenswrapper[4839]: I0321 04:26:08.452205 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:08 crc kubenswrapper[4839]: E0321 04:26:08.452331 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.452676 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.452767 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.452833 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.452910 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.454329 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.454531 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:10 crc kubenswrapper[4839]: I0321 04:26:10.451816 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:10 crc kubenswrapper[4839]: E0321 04:26:10.451959 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:10 crc kubenswrapper[4839]: I0321 04:26:10.452698 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:10 crc kubenswrapper[4839]: E0321 04:26:10.452944 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.452612 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.452730 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.452879 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.452923 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.453010 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.453058 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.549191 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:12 crc kubenswrapper[4839]: I0321 04:26:12.451959 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:12 crc kubenswrapper[4839]: E0321 04:26:12.452113 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.452832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.453187 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.453499 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.453688 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.454028 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.454223 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:14 crc kubenswrapper[4839]: I0321 04:26:14.452529 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:14 crc kubenswrapper[4839]: E0321 04:26:14.452785 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452769 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.452918 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452790 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.453135 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.453189 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.452782 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.452914 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.471963 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.487378 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.500840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.515455 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.525051 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.534338 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.542778 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.549493 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.553957 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.567101 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.580637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585884 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585933 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585961 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.595207 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.600455 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605059 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605115 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605143 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.608465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.617443 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620638 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620665 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620678 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.621430 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.631234 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.632908 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635808 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635843 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635851 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635864 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635876 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.649186 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.649561 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652986 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.653004 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.653018 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.663347 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.666650 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.666769 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.682785 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.695454 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.705902 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:17 crc 
kubenswrapper[4839]: I0321 04:26:17.219438 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.219619 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.219680 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:27:21.219664728 +0000 UTC m=+245.547451394 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452115 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452130 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452326 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452427 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452512 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452729 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:18 crc kubenswrapper[4839]: I0321 04:26:18.452164 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:18 crc kubenswrapper[4839]: E0321 04:26:18.452349 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451763 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.452199 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451836 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.452899 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451799 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.453146 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:20 crc kubenswrapper[4839]: I0321 04:26:20.452553 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:20 crc kubenswrapper[4839]: E0321 04:26:20.452800 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.451988 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.452074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.452095 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452818 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452996 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.550832 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:22 crc kubenswrapper[4839]: I0321 04:26:22.452705 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:22 crc kubenswrapper[4839]: E0321 04:26:22.453139 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:22 crc kubenswrapper[4839]: I0321 04:26:22.453278 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:22 crc kubenswrapper[4839]: E0321 04:26:22.453397 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452485 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452604 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452676 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452726 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:24 crc kubenswrapper[4839]: I0321 04:26:24.452023 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:24 crc kubenswrapper[4839]: E0321 04:26:24.452160 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.452001 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.452016 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.453253 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453487 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453617 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453726 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.453022 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:26 crc kubenswrapper[4839]: E0321 04:26:26.454412 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.474996 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.493476 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.512847 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.529449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.545909 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: E0321 04:26:26.551503 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.566749 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.581247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.597828 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.686036 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zqcw4" podStartSLOduration=126.686018502 podStartE2EDuration="2m6.686018502s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.674672125 +0000 UTC m=+191.002458801" watchObservedRunningTime="2026-03-21 04:26:26.686018502 +0000 UTC m=+191.013805178" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.705141 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sxs57" 
podStartSLOduration=126.705122371 podStartE2EDuration="2m6.705122371s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.687441185 +0000 UTC m=+191.015227861" watchObservedRunningTime="2026-03-21 04:26:26.705122371 +0000 UTC m=+191.032909047" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.734001 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-scp2c" podStartSLOduration=126.733983879 podStartE2EDuration="2m6.733983879s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.733559296 +0000 UTC m=+191.061345972" watchObservedRunningTime="2026-03-21 04:26:26.733983879 +0000 UTC m=+191.061770555" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.753774 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" podStartSLOduration=125.753754097 podStartE2EDuration="2m5.753754097s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.745602135 +0000 UTC m=+191.073388831" watchObservedRunningTime="2026-03-21 04:26:26.753754097 +0000 UTC m=+191.081540773" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.774532 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=93.774518405 podStartE2EDuration="1m33.774518405s" podCreationTimestamp="2026-03-21 04:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 
04:26:26.773670589 +0000 UTC m=+191.101457265" watchObservedRunningTime="2026-03-21 04:26:26.774518405 +0000 UTC m=+191.102305081" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863510 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863549 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863627 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:26Z","lastTransitionTime":"2026-03-21T04:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.908963 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d"] Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.910228 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916057 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916304 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916381 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916935 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.929309 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.929281818 podStartE2EDuration="52.929281818s" podCreationTimestamp="2026-03-21 04:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.928112843 +0000 UTC m=+191.255899519" watchObservedRunningTime="2026-03-21 04:26:26.929281818 +0000 UTC m=+191.257068534" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.943016 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.942996766 podStartE2EDuration="30.942996766s" podCreationTimestamp="2026-03-21 04:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.942361097 +0000 UTC m=+191.270147803" watchObservedRunningTime="2026-03-21 04:26:26.942996766 
+0000 UTC m=+191.270783442" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.955174 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podStartSLOduration=126.955152608 podStartE2EDuration="2m6.955152608s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.954992563 +0000 UTC m=+191.282779239" watchObservedRunningTime="2026-03-21 04:26:26.955152608 +0000 UTC m=+191.282939284" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.995515 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=47.995499108 podStartE2EDuration="47.995499108s" podCreationTimestamp="2026-03-21 04:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.982527292 +0000 UTC m=+191.310313968" watchObservedRunningTime="2026-03-21 04:26:26.995499108 +0000 UTC m=+191.323285784" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.996244 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=99.99623707 podStartE2EDuration="1m39.99623707s" podCreationTimestamp="2026-03-21 04:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.995422306 +0000 UTC m=+191.323208992" watchObservedRunningTime="2026-03-21 04:26:26.99623707 +0000 UTC m=+191.324023746" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.016815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.016951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017002 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017043 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017158 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118521 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118737 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119011 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119044 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.126154 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc 
kubenswrapper[4839]: I0321 04:26:27.140008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.229057 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.404333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" event={"ID":"0207a845-18d9-4431-844b-4bd01600c2d5","Type":"ContainerStarted","Data":"7f5715c87d2d5e7a3ef02da86248f54671904f0c67edbbfc034461ba914a9b40"} Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452043 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452114 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452205 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452260 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452319 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.505257 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.513643 4839 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.409362 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" event={"ID":"0207a845-18d9-4431-844b-4bd01600c2d5","Type":"ContainerStarted","Data":"3d8f42dbc76c69b64b7fd8850007b7710aac379b8538e5eb25782151e9647dac"} Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.429139 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g47qh" podStartSLOduration=128.429120201 podStartE2EDuration="2m8.429120201s" 
podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:27.030303313 +0000 UTC m=+191.358089989" watchObservedRunningTime="2026-03-21 04:26:28.429120201 +0000 UTC m=+192.756906897" Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.452307 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:28 crc kubenswrapper[4839]: E0321 04:26:28.452550 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452166 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452236 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452169 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452282 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452317 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452360 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:30 crc kubenswrapper[4839]: I0321 04:26:30.452055 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:30 crc kubenswrapper[4839]: E0321 04:26:30.452405 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451803 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451970 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.451974 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.452017 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.452097 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.553099 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:26:32 crc kubenswrapper[4839]: I0321 04:26:32.452711 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:32 crc kubenswrapper[4839]: E0321 04:26:32.452883 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.451835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.451970 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.452017 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.452181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.452601 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.453277 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"
Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.453422 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.425342 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426067 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426133 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1" exitCode=1
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"}
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426206 4839 scope.go:117] "RemoveContainer" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426590 4839 scope.go:117] "RemoveContainer" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"
Mar 21 04:26:34 crc kubenswrapper[4839]: E0321 04:26:34.426772 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zqcw4_openshift-multus(1602189b-f4f3-40ee-ba63-c695c11069d0)\"" pod="openshift-multus/multus-zqcw4" podUID="1602189b-f4f3-40ee-ba63-c695c11069d0"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.441514 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" podStartSLOduration=134.441498104 podStartE2EDuration="2m14.441498104s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:28.428552884 +0000 UTC m=+192.756339600" watchObservedRunningTime="2026-03-21 04:26:34.441498104 +0000 UTC m=+198.769284780"
Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.452095 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:34 crc kubenswrapper[4839]: E0321 04:26:34.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.430071 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log"
Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452483 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452483 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452642 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452698 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452488 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452794 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:36 crc kubenswrapper[4839]: I0321 04:26:36.451999 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:36 crc kubenswrapper[4839]: E0321 04:26:36.452763 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:36 crc kubenswrapper[4839]: E0321 04:26:36.553875 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452040 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452139 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452220 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452155 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452323 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:38 crc kubenswrapper[4839]: I0321 04:26:38.452502 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:38 crc kubenswrapper[4839]: E0321 04:26:38.452762 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.452874 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452725 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.452999 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.453169 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:40 crc kubenswrapper[4839]: I0321 04:26:40.451922 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:40 crc kubenswrapper[4839]: E0321 04:26:40.452053 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452029 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452051 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452140 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452266 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452378 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.555249 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:26:42 crc kubenswrapper[4839]: I0321 04:26:42.451943 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:42 crc kubenswrapper[4839]: E0321 04:26:42.452079 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451755 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451785 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.451894 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451905 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.452005 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.452077 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:44 crc kubenswrapper[4839]: I0321 04:26:44.452743 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:44 crc kubenswrapper[4839]: E0321 04:26:44.453143 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:44 crc kubenswrapper[4839]: I0321 04:26:44.453492 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.307355 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"]
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.307502 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.307630 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.451897 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.452271 4839 scope.go:117] "RemoveContainer" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.451969 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.452276 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.452496 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.460559 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.465686 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"}
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.466057 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b"
Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.510845 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podStartSLOduration=145.510823351 podStartE2EDuration="2m25.510823351s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:45.509840532 +0000 UTC m=+209.837627208" watchObservedRunningTime="2026-03-21 04:26:45.510823351 +0000 UTC m=+209.838610037"
Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.452295 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:46 crc kubenswrapper[4839]: E0321 04:26:46.453819 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.469507 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log"
Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.469627 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4"}
Mar 21 04:26:46 crc kubenswrapper[4839]: E0321 04:26:46.555835 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452371 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452377 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.452596 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452384 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.452708 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.453116 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:48 crc kubenswrapper[4839]: I0321 04:26:48.452262 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:48 crc kubenswrapper[4839]: E0321 04:26:48.452478 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.359533 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.359733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.359698362 +0000 UTC m=+335.687485058 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.360116 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.360180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360293 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360385 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.360364872 +0000 UTC m=+335.688151548 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360304 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360466 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.360455935 +0000 UTC m=+335.688242631 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452369 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452516 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452604 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452685 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452800 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452956 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.460973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.461060 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461212 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461225 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461268 4839 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461280 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461337 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.461320581 +0000 UTC m=+335.789107257 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461234 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461443 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 
04:26:49.461501 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.461482976 +0000 UTC m=+335.789269752 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:50 crc kubenswrapper[4839]: I0321 04:26:50.452479 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:50 crc kubenswrapper[4839]: E0321 04:26:50.452664 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452007 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452030 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452052 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452290 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452356 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.451863 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.454194 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.454688 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452122 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452205 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452122 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.454399 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.454522 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.458910 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.461360 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.545241 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.579692 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.580197 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.581146 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.581586 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.582698 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.583193 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584201 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584277 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.585892 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.586453 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589548 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589548 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589654 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.592392 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.592993 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.594855 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596362 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596526 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596692 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:26:57 
crc kubenswrapper[4839]: I0321 04:26:57.596954 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597076 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597500 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597703 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597732 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597635 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597917 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598252 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598354 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598443 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598359 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598614 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598902 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598624 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598676 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.600280 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.601198 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.602350 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603177 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603268 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603524 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603774 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603915 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603780 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604221 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604351 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604404 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" 
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604486 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.605292 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.606304 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607099 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607102 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.614640 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607173 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.608388 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.616496 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.617204 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618022 4839 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618267 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618980 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619326 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619426 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619532 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619621 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619686 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620183 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620319 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620393 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.620473 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620552 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620649 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621137 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621806 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621903 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622029 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622144 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622235 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.634477 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.643877 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.643918 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.652910 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655015 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655400 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655616 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656343 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656623 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656702 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.657769 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:26:57 crc kubenswrapper[4839]: 
I0321 04:26:57.657802 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.658211 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.658815 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659741 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660901 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661233 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.667955 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.667985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668103 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668149 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668192 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668230 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668247 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668294 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668340 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668360 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659997 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660127 4839 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660224 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660854 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661248 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661356 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661378 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.662294 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.663817 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665096 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665141 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.665558 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665659 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665759 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665789 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665816 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660170 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.670187 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.671621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.672735 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673365 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673631 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673794 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673890 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674606 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674723 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674818 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674915 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675009 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675111 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675206 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675619 4839 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.676149 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.676356 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.683412 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.686441 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687629 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687869 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.688073 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.693265 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697065 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697602 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697809 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697862 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698084 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698310 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698368 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.701611 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.701658 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.714984 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.716002 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.718250 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.718319 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.720324 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.734304 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.735171 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.735847 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736351 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736461 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736834 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.738492 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.739148 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.742790 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.743693 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.744507 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.744836 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.745339 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.750114 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w6dzs"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.750897 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.754184 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.754228 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.755516 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.757743 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.759885 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.759933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.763057 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.764089 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.764914 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.767667 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.768448 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769235 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769312 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769341 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769363 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769424 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769462 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.769489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769511 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769536 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769557 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769606 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: 
\"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769641 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: \"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769752 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769777 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769801 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769825 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769850 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769908 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769918 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770379 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770419 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770457 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770479 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770518 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770539 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770556 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770606 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770623 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770640 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770660 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770677 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770751 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770769 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770784 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770802 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770820 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770860 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770876 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770916 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8kjw\" (UniqueName: \"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770949 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770983 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771003 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771118 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" 
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771162 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771176 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771194 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgq6h\" (UniqueName: \"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771223 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771238 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771236 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771254 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771271 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771288 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771319 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771339 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771354 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771368 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771384 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771498 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771521 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771559 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771598 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771639 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771656 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771676 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771706 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.771729 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771745 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771761 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771777 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771793 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthj5\" (UniqueName: 
\"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771808 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771822 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772039 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772519 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772772 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773034 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773883 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.774335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.774455 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.775103 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.776676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5jhkc"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.778207 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jhkc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.778337 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.779059 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.779458 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.780064 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.780860 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.781095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.781204 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.782319 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.782375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.783092 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.784084 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.792606 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bj929"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.793230 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.794089 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.794408 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.796114 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.797374 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.798377 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.799478 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.800833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.801716 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.804297 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.805398 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.806759 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4sj57"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.808501 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4sj57"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.808770 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.809131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.810926 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.812454 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.813145 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.814429 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.815845 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.817696 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.822124 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.823808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brnnr"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.825688 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.830845 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.832289 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cstqb"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.836368 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.837502 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.838809 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jhkc"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.839826 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842138 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842278 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842321 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.843294 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.845208 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.846214 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.847999 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.849166 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.850539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"]
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.862099 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872325 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872498 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872635 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872949 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873225 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wthj5\" (UniqueName: \"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873501 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873336 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873793 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873864 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873898 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873923 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873992 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874120 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: \"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874142 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874169 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874239 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874292 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874360 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874414 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874452 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874480 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874533 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874555 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874598 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874622 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874646 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874671 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874698 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874817 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874851 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874874 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874914 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874958 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875016 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875049 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875078 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875132 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8kjw\" (UniqueName: \"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875200 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875222 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875311 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875317 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod
\"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875336 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgq6h\" (UniqueName: \"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875360 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875382 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875404 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875425 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875457 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875490 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875508 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875595 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875618 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875698 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875726 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875749 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875799 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875915 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.876772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874905 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877154 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877442 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877948 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878444 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878665 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") 
" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879440 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879668 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879795 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879812 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879875 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: 
\"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880207 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880302 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880873 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881144 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881271 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881687 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881955 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.882060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.882651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.883631 4839 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.883695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.884157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885201 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885628 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: 
\"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886339 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886530 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886663 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.887641 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889898 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889958 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.891264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.892166 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.892168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.894223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.896056 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.902740 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.907414 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.922591 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.949717 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.960490 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.962738 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976636 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976742 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976819 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976886 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977006 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" 
(UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977060 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977773 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.982692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.990011 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.002312 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.021912 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.043235 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.062375 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.083338 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.102644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.123536 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.131224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:58 crc 
kubenswrapper[4839]: I0321 04:26:58.142132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.148784 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.162210 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.182899 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.201907 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.229213 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.242300 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.262868 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.283164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.303076 4839 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.323311 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.342809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.362915 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.382927 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.403385 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.422615 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.443758 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.463243 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.468163 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod 
\"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.482812 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.503755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.511604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.523781 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.544521 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.563849 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.583410 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.603346 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.623025 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.644238 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.663803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.684072 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.702978 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.723658 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.743067 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.760482 4839 request.go:700] Waited for 1.009015012s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-stats-default&limit=500&resourceVersion=0 Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.762160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 
04:26:58.783351 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.803389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.823306 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.842365 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.863586 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.882961 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.910386 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.923326 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.943359 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.953721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.963151 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.983115 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.004374 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.038990 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.063280 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.068169 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.082600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.103290 4839 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.117750 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.123508 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.143345 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.168110 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.179519 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.182497 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.202833 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.223235 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.242776 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 
21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.265239 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.283141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.303846 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.323248 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.342449 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.383407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.402055 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.402266 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.408630 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.409906 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:59 crc kubenswrapper[4839]: W0321 04:26:59.416767 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67adff78_dfe5_440a_80b0_fefd703c3aa7.slice/crio-81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf WatchSource:0}: Error finding container 81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf: Status 404 returned error can't find the container with id 81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf Mar 21 04:26:59 crc kubenswrapper[4839]: W0321 04:26:59.417110 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d393d7_42d7_4b7d_a3cd_f7e325b97c54.slice/crio-c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465 WatchSource:0}: Error finding container c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465: Status 404 returned error can't find the container with id c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465 Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.423008 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.443127 4839 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.461983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.485376 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.503657 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.521155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465"} Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.522381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerStarted","Data":"81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf"} Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.523975 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.543366 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.564209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.564385 4839 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.582295 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.602776 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.623034 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.643121 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.663009 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.682622 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.723365 4839 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.742482 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.761134 4839 request.go:700] Waited for 1.926755326s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.763190 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.801711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.816732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthj5\" (UniqueName: \"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.841533 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.841858 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.855707 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.860656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.867097 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.881505 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.896969 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: \"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.921602 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.930604 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qp8mz"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.952805 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.955960 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.963189 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.992296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.009038 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8kjw\" (UniqueName: \"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.031801 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.036332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.065274 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.083680 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"]
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.085679 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgq6h\" (UniqueName: \"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.087071 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"]
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.097599 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.107125 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.116980 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.123640 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.126465 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"]
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.128523 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.133103 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.137198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.152955 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.162886 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.174897 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"]
Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.186898 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d63cdfd_21e7_4a63_960b_363fb131ac08.slice/crio-3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf WatchSource:0}: Error finding container 3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf: Status 404 returned error can't find the container with id 3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.203854 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.212405 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215764 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215845 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216071 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216131 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216197 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216351 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216375 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216403 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216432 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216492 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216541 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216743 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216777 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216969 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217006 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217184 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217242 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217293 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217363 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217463 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217492 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217523 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"
Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.219915 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.719896838 +0000 UTC m=+225.047683514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220337 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220541 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220589 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220615 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220724 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nxn\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220878 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220902 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.221550 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.221685 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.245869 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.263047 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hkg98"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.275506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.323948 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.324454 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.824433914 +0000 UTC m=+225.152220580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324626 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod
\"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324700 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.325679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326329 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326422 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326461 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" 
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326495 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326529 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326559 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326601 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326679 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.326697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326735 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326755 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326816 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326839 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326863 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326898 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327032 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327058 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.327107 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327122 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327139 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327157 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327174 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327189 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327204 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nxn\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327251 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327470 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" 
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327560 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327628 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod 
\"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327738 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327793 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327812 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327857 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327912 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327941 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327983 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328002 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" 
(UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328021 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328037 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328725 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.331757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.332205 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.334019 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.336878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337174 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337392 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.339524 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.340177 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.341179 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.344448 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.344779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.346243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347093 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347258 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.348098 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.848080391 +0000 UTC m=+225.175867067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.349421 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.349950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.350915 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.350988 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.351374 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.351945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.359337 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.359592 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360101 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360537 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.361088 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.369616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370211 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370445 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.371992 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.378081 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.391465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.392036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.393019 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.400464 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.407743 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.422417 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.424908 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.431940 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432264 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432296 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432341 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: 
\"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432485 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432537 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: 
\"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432559 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432605 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432628 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: 
\"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432778 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432806 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432826 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432871 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432889 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 
21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432934 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.433988 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.434140 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.434400 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.437409 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440310 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.440467 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.940434703 +0000 UTC m=+225.268221519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440587 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440911 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.441024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.445436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.445550 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.446476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.448193 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.450248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.462062 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.465332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.486811 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.487982 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.509633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.526436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nxn\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.535531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"bebe147c7d3c9f550ac210dc3b87fa28986a146b1b4c1a309591b5d1c2e502e6"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.536293 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.537709 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:01.037680161 +0000 UTC m=+225.365466837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.543677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.544079 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" event={"ID":"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5","Type":"ContainerStarted","Data":"69c4299d7bce6eb82eeb4f3117432443b42c0d2372ac0441df1988528f83756e"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.551467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qp8mz" event={"ID":"4d63cdfd-21e7-4a63-960b-363fb131ac08","Type":"ContainerStarted","Data":"3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.553763 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" 
event={"ID":"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf","Type":"ContainerStarted","Data":"104bf948e8df0428dbcca58a1fb16cf0fb51aec4d7ecc1e8a05a5ab60ffd5268"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.562346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"12ed70382799e5e4264fe38403057c4e4da70ccbeb46a9c9bf332c2f17ea1512"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.564334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567013 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerStarted","Data":"e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567052 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerStarted","Data":"14f69a0375e6c5d03a334a0b16024be0f89d91b54fd01559c41e7673087b6d53"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.569636 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerStarted","Data":"3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.569666 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerStarted","Data":"37e1b23e6295895c07381fb6ccbe7c11a0a79e23312f6edb20749b2a0cf5c684"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.570179 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.572555 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerStarted","Data":"acab2b98e2e3828439a02d33c7b3fd1855365edb0946861b3e5dc01800f9adfe"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.574112 4839 generic.go:334] "Generic (PLEG): container finished" podID="67adff78-dfe5-440a-80b0-fefd703c3aa7" containerID="86006a257303cbb685395d36b051f2b2669e91a34cb7a57c065b2e9ca3122ed3" exitCode=0 Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.574205 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerDied","Data":"86006a257303cbb685395d36b051f2b2669e91a34cb7a57c065b2e9ca3122ed3"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.577929 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.579956 4839 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-45jfn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.579988 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.580289 4839 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-76ctz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.580310 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.589253 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 
04:27:00.591476 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"db930d1be9182a20b88151b3736501f6341e4e6a1cbd302ae707a449c2596737"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.591525 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"00b4c67df5fe9cc89fd22031364169470c3fcfc89d4ce53b74bb4b6af15acd7a"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.603099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.617113 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.617142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"2a0fd9145ce6bbc2d66d5309033c5df6833aa692240adbf22a21b5c347926398"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.630080 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.637964 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638006 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.638062 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.138040372 +0000 UTC m=+225.465827048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638516 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638813 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.639511 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.139490425 +0000 UTC m=+225.467277101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.656894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.675539 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.683844 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.684295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.691711 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.706448 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.706688 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.707336 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.716629 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.728482 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744180 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.744460 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.244441483 +0000 UTC m=+225.572228159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744948 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.745688 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.24567298 +0000 UTC m=+225.573459646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.748959 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.752963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.768188 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.779246 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.781158 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.786134 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.798613 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.802268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: 
\"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.806469 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.820542 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.839140 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.842808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.847052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.847400 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.347383581 +0000 UTC m=+225.675170257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.848862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.865864 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.868873 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.873355 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.911420 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db892a0_fb40_4e0e_93ee_a8f2876ad8be.slice/crio-257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e WatchSource:0}: Error finding container 
257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e: Status 404 returned error can't find the container with id 257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.944831 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdfec0a_a8bf_47b0_b51a_75a76d4341f2.slice/crio-774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a WatchSource:0}: Error finding container 774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a: Status 404 returned error can't find the container with id 774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.962382 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.965986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.966627 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ce563b_8e5b_4abe_b71b_02c588bff511.slice/crio-9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9 WatchSource:0}: Error finding container 9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9: Status 404 returned error can't find the container with id 9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9 Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.966979 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.466959697 +0000 UTC m=+225.794746373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.970042 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.028246 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.075736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.076099 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.57607883 +0000 UTC m=+225.903865506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.076638 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.077369 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.577358378 +0000 UTC m=+225.905145054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.098004 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.100225 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6240548e_b827_4fdb_b2be_c7187d6a28e8.slice/crio-dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931 WatchSource:0}: Error finding container dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931: Status 404 returned error can't find the container with id dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931 Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.101643 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79fc033_c671_42ff_aa06_78ae64967c92.slice/crio-142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315 WatchSource:0}: Error finding container 142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315: Status 404 returned error can't find the container with id 142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.111358 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.131517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.186076 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.186300 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.686255134 +0000 UTC m=+226.014041810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.186591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.187416 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.687408369 +0000 UTC m=+226.015195045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.197977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" podStartSLOduration=160.197959424 podStartE2EDuration="2m40.197959424s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.197778839 +0000 UTC m=+225.525565515" watchObservedRunningTime="2026-03-21 04:27:01.197959424 +0000 UTC m=+225.525746100" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.234140 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.288349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.289276 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.789255144 +0000 UTC m=+226.117041820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.323026 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc52acb_29f0_4f24_a46a_928a529264dc.slice/crio-1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8 WatchSource:0}: Error finding container 1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8: Status 404 returned error can't find the container with id 1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.374707 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.376666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.391904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.392256 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.892238304 +0000 UTC m=+226.220024980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.401422 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.423013 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.444216 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.454950 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.488482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"] Mar 21 04:27:01 crc 
kubenswrapper[4839]: I0321 04:27:01.491579 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podStartSLOduration=161.491540553 podStartE2EDuration="2m41.491540553s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.484855883 +0000 UTC m=+225.812642569" watchObservedRunningTime="2026-03-21 04:27:01.491540553 +0000 UTC m=+225.819327229" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.494323 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.496212 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.496893 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.996869922 +0000 UTC m=+226.324656598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.560911 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podStartSLOduration=160.560895157 podStartE2EDuration="2m40.560895157s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.513982004 +0000 UTC m=+225.841768680" watchObservedRunningTime="2026-03-21 04:27:01.560895157 +0000 UTC m=+225.888681833" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.561666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"] Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.566589 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda178972b_b463_42db_b2c9_dcba9a51c4bc.slice/crio-7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79 WatchSource:0}: Error finding container 7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79: Status 404 returned error can't find the container with id 7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.567186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.598344 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.600337 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.100314706 +0000 UTC m=+226.428101382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.639015 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34efe2c8_d7a8_47c1_8890_85ebd5ef1eb9.slice/crio-f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871 WatchSource:0}: Error finding container f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871: Status 404 returned error can't find the container with id f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871 Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.639490 4839 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd3a400_6155_44b9_a358_d2cd089db1f6.slice/crio-570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073 WatchSource:0}: Error finding container 570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073: Status 404 returned error can't find the container with id 570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.643662 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" event={"ID":"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf","Type":"ContainerStarted","Data":"87165f3b58c7c76de9f3e3e80e2dcb2a93d5a302c209109ff14fff2635704ea2"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.662620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"e50502a2010cac7980cfefa0621f469d7b42ed82cc45209c2b4818de457bba55"} Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.663323 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda810b51a_5b19_4da9_ad80_05f189d821e4.slice/crio-2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d WatchSource:0}: Error finding container 2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d: Status 404 returned error can't find the container with id 2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.673298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" 
event={"ID":"93bc1508-a828-4d23-b078-1d4164d1bc2c","Type":"ContainerStarted","Data":"94c766f682c6ebef7629e40f94406109f19df712e94a944ff6c5ac196f0815cf"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.677444 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerStarted","Data":"f6b6c78c9d1bdb848eba4c2502571d555e01e3ddf133248352c6864deb900c4d"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.681269 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hkg98" event={"ID":"f7156267-6917-4c54-ba75-4a91a0772025","Type":"ContainerStarted","Data":"bec5b782b49b348195a3494de220af759b7d819426abf2092fef180efedbebb1"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.691620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qp8mz" event={"ID":"4d63cdfd-21e7-4a63-960b-363fb131ac08","Type":"ContainerStarted","Data":"c59c5df1b834753b3397abb13d229ecb94c80f25b00f08838046e04f48ad820c"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.691796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.694737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w6dzs" event={"ID":"28ce563b-8e5b-4abe-b71b-02c588bff511","Type":"ContainerStarted","Data":"9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.696841 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerStarted","Data":"5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d"} Mar 21 04:27:01 
crc kubenswrapper[4839]: I0321 04:27:01.697556 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.700989 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.701622 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.201594584 +0000 UTC m=+226.529381260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.701818 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.702054 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8"} Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.702555 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.202544933 +0000 UTC m=+226.530331609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.705984 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"7dadea2ef0ce3a1677e46665b198a5ba8d6e517c88efd0a391f511f13f383ce4"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.730666 4839 generic.go:334] "Generic (PLEG): container finished" podID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerID="60becfbe63e155a2f60cd187ea8d4be4fbd710e4f3d615c52e807eb20a456a0f" exitCode=0 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.730866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerDied","Data":"60becfbe63e155a2f60cd187ea8d4be4fbd710e4f3d615c52e807eb20a456a0f"} Mar 21 04:27:01 crc 
kubenswrapper[4839]: I0321 04:27:01.733818 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4sj57" event={"ID":"a178972b-b463-42db-b2c9-dcba9a51c4bc","Type":"ContainerStarted","Data":"7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.735207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerStarted","Data":"774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.741322 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" event={"ID":"a79fc033-c671-42ff-aa06-78ae64967c92","Type":"ContainerStarted","Data":"142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.749933 4839 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zt77f container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.750017 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.751506 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.751630 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.820664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.823338 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.323314364 +0000 UTC m=+226.651101040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.865858 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"8b2e1ca7f6124db0444244a4f24ba74da0d7c7ef5546bae4cc34079e0bc014c1"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.872073 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.876984 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerStarted","Data":"dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.885069 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" event={"ID":"3db892a0-fb40-4e0e-93ee-a8f2876ad8be","Type":"ContainerStarted","Data":"257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.887796 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerStarted","Data":"36a30ee3de09b6a8070c6d8df57ed3e960516cc6813b51d774b7ac64b759f079"} Mar 21 
04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.895837 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" event={"ID":"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5","Type":"ContainerStarted","Data":"18ebfd00c2bdfa1069adfe9873fe32d14a8d869d0ed9b0b7f2aafcdd2abbfa77"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.898862 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923150 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923761 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"df9320e90b9c1289c659ebd828f5ec8dc8624d7becb6e45520b5ad077c294540"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"5e968af271e571456f1fc861cbce1ec77c551902e916a8cd6bcf6e1d56aff536"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925318 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925541 4839 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-76ctz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925604 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.926144 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.426131069 +0000 UTC m=+226.753917745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.933100 4839 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-45jfn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.933186 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.942796 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"]
Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.949097 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.006395 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.035040 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.036233 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.53621556 +0000 UTC m=+226.864002236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.041379 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc77b0b_af5d_42f5_8fd1_69ac8d6616d8.slice/crio-3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04 WatchSource:0}: Error finding container 3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04: Status 404 returned error can't find the container with id 3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04
Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.042746 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bceecf8_583d_4e26_9749_f5939280540b.slice/crio-f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650 WatchSource:0}: Error finding container f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650: Status 404 returned error can't find the container with id f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650
Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.047885 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod685c3b51_a70f_484e_b7db_f98383f75003.slice/crio-046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1 WatchSource:0}: Error finding container 046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1: Status 404 returned error can't find the container with id 046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.061427 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.109010 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jhkc"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.126183 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.129173 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"]
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.139402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.140522 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.640487208 +0000 UTC m=+226.968273884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.143708 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.143650 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a76ad1_da33_4b42_9c0a_d0ada077729a.slice/crio-a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8 WatchSource:0}: Error finding container a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8: Status 404 returned error can't find the container with id a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.240464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.241137 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.741117338 +0000 UTC m=+227.068904014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.241293 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.241654 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.741647483 +0000 UTC m=+227.069434159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.346320 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.346468 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.846443327 +0000 UTC m=+227.174230003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.347658 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.348290 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.848275012 +0000 UTC m=+227.176061688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.449886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.450598 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.950583081 +0000 UTC m=+227.278369757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.545648 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" podStartSLOduration=161.545623473 podStartE2EDuration="2m41.545623473s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.542815129 +0000 UTC m=+226.870601815" watchObservedRunningTime="2026-03-21 04:27:02.545623473 +0000 UTC m=+226.873410149"
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.551406 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.551964 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.051952522 +0000 UTC m=+227.379739198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.590920 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" podStartSLOduration=162.590905437 podStartE2EDuration="2m42.590905437s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.568040043 +0000 UTC m=+226.895826729" watchObservedRunningTime="2026-03-21 04:27:02.590905437 +0000 UTC m=+226.918692113"
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.591636 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qp8mz" podStartSLOduration=162.591629649 podStartE2EDuration="2m42.591629649s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.591103673 +0000 UTC m=+226.918890349" watchObservedRunningTime="2026-03-21 04:27:02.591629649 +0000 UTC m=+226.919416325"
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.639091 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" podStartSLOduration=162.639074568 podStartE2EDuration="2m42.639074568s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.637797139 +0000 UTC m=+226.965583805" watchObservedRunningTime="2026-03-21 04:27:02.639074568 +0000 UTC m=+226.966861234"
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.653190 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.653682 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.153658764 +0000 UTC m=+227.481445440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.736942 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podStartSLOduration=162.736925374 podStartE2EDuration="2m42.736925374s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.682531697 +0000 UTC m=+227.010318383" watchObservedRunningTime="2026-03-21 04:27:02.736925374 +0000 UTC m=+227.064712050"
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.759545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.760140 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.260123497 +0000 UTC m=+227.587910173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.867876 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.868192 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.368174148 +0000 UTC m=+227.695960824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.955066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"462a8bb447e356d4b7a74bcc52ea16689ebef97585cfaf2dd0b1940d90242dbe"}
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.972396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.972796 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.472783056 +0000 UTC m=+227.800569732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.973269 4839 generic.go:334] "Generic (PLEG): container finished" podID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerID="045c9336a13b752744cb5d13222b2d90daf20b00e721612357866f428a0e3828" exitCode=0
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.974461 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerDied","Data":"045c9336a13b752744cb5d13222b2d90daf20b00e721612357866f428a0e3828"}
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.984460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"3720dcdc84d0b541718745f3af152af7b1b68a7618c107c94916993866a70cf0"}
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.984510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d"}
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.993234 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"e2b84720a1b200524901f41ca64fd70986ddf0fdd064e6d873edb1c351855d72"}
Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.993759 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"b088cc5d0a87ccf4d21ca157a589c34e462b3620d9b14feb6ce67e36f4b78c7a"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.003079 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerStarted","Data":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.005134 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.008046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brnnr" event={"ID":"d1439545-f492-4e4c-858c-ec85c5c2a9d9","Type":"ContainerStarted","Data":"187e50ca092521b2aa02a610aa2b4cadf43ce1a33f7adfcc7192fe95d3f50787"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.014531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"9143e2ccac432e733a084565d2fa1c38d821357e0d475219c7271e385d278800"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.014609 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"eaba76d3d3d621142d7292023ad530d32d9757f6e702f3b0e5deb428624e31ae"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.023619 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podStartSLOduration=162.023598636 podStartE2EDuration="2m42.023598636s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.020424121 +0000 UTC m=+227.348210797" watchObservedRunningTime="2026-03-21 04:27:03.023598636 +0000 UTC m=+227.351385312"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025138 4839 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8jgh7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025185 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerStarted","Data":"3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.041548 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" podStartSLOduration=163.041532702 podStartE2EDuration="2m43.041532702s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.040372038 +0000 UTC m=+227.368158714" watchObservedRunningTime="2026-03-21 04:27:03.041532702 +0000 UTC m=+227.369319398"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.043694 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.058000 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" event={"ID":"685c3b51-a70f-484e-b7db-f98383f75003","Type":"ContainerStarted","Data":"046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.073311 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.073727 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.573708704 +0000 UTC m=+227.901495380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.082440 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"91fdc9d2521333990da8d1f6444e8c42711f0aa0359076d3070c0e7a6f7242e2"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.088021 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4sj57" event={"ID":"a178972b-b463-42db-b2c9-dcba9a51c4bc","Type":"ContainerStarted","Data":"133a77624e75b046fd14ceaebf5844fd31bd6a20c4335f0394025db10d528268"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.091807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"5ce59bfc9a8d76f61b475a748505225dea02075e980d4be95f046cacab374fb3"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.092154 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.098168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"10517d841e330452bbfc351ceb4a23f6516213a6e7439f1777eb972e63ebd915"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.143794 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4sj57" podStartSLOduration=6.143772369 podStartE2EDuration="6.143772369s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.119835484 +0000 UTC m=+227.447622170" watchObservedRunningTime="2026-03-21 04:27:03.143772369 +0000 UTC m=+227.471559045"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.156778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" event={"ID":"745f7801-7150-4924-b9fb-e8a0aa1e7edb","Type":"ContainerStarted","Data":"ace0517a65be11d7ab5a93fb88093741fe4b523a4638575f37a6b995e5c62697"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.156838 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" event={"ID":"745f7801-7150-4924-b9fb-e8a0aa1e7edb","Type":"ContainerStarted","Data":"f676c0c1b6ca31caace169e795dc89298d1ab5fa78697ed87adc8f350eda2000"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.172187 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" podStartSLOduration=163.172165669 podStartE2EDuration="2m43.172165669s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.168156019 +0000 UTC m=+227.495942695" watchObservedRunningTime="2026-03-21 04:27:03.172165669 +0000 UTC m=+227.499952345"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.177604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.183427 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.683409705 +0000 UTC m=+228.011196371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.200602 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" podStartSLOduration=162.200588478 podStartE2EDuration="2m42.200588478s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.199922289 +0000 UTC m=+227.527708965" watchObservedRunningTime="2026-03-21 04:27:03.200588478 +0000 UTC m=+227.528375154"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.213926 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" event={"ID":"6dd3a400-6155-44b9-a358-d2cd089db1f6","Type":"ContainerStarted","Data":"17a65b035567eb719a2f53758708b74e68f6b1f5d9b8593826375378484ff4e1"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.213975 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" event={"ID":"6dd3a400-6155-44b9-a358-d2cd089db1f6","Type":"ContainerStarted","Data":"570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.231025 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"cd60520aa8e3d25f4578412e6964fff5da176e03ce4c18684a063c91d4747cba"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.249255 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" event={"ID":"ad426123-af7f-45c4-8a6b-bca3c83017be","Type":"ContainerStarted","Data":"1a56073ecc981d7ea1066ad2e2a545424cc1438c17c937db0ed6b70dfc89f735"}
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.258080 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" podStartSLOduration=162.258062397 podStartE2EDuration="2m42.258062397s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.257648805 +0000 UTC m=+227.585435481" watchObservedRunningTime="2026-03-21 04:27:03.258062397 +0000 UTC m=+227.585849073"
Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.261296 4839
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" podStartSLOduration=163.261280603 podStartE2EDuration="2m43.261280603s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.228190334 +0000 UTC m=+227.555977010" watchObservedRunningTime="2026-03-21 04:27:03.261280603 +0000 UTC m=+227.589067279" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.275257 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" event={"ID":"93bc1508-a828-4d23-b078-1d4164d1bc2c","Type":"ContainerStarted","Data":"a2e9dbd6a8e6a22842ccb2c8c268abb030d9130744c130549c78eba74a31e38b"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.278383 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.278758 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.778743605 +0000 UTC m=+228.106530281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.316540 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" podStartSLOduration=163.316516015 podStartE2EDuration="2m43.316516015s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.315717141 +0000 UTC m=+227.643503827" watchObservedRunningTime="2026-03-21 04:27:03.316516015 +0000 UTC m=+227.644302691" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.322337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" event={"ID":"2bceecf8-583d-4e26-9749-f5939280540b","Type":"ContainerStarted","Data":"f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.350016 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w6dzs" event={"ID":"28ce563b-8e5b-4abe-b71b-02c588bff511","Type":"ContainerStarted","Data":"08587c23a111b9d07306bacd6540f579f28544379d121f7c40b0619835c75da7"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.378303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" 
event={"ID":"3db892a0-fb40-4e0e-93ee-a8f2876ad8be","Type":"ContainerStarted","Data":"8bc2f6e184e6ff56e260fca04b6c5b3b4a0ad543df5a1345d96246b3e1860851"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.379966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.380536 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.880518429 +0000 UTC m=+228.208305185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.389213 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w6dzs" podStartSLOduration=163.389193548 podStartE2EDuration="2m43.389193548s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.38792966 +0000 UTC m=+227.715716336" watchObservedRunningTime="2026-03-21 04:27:03.389193548 +0000 UTC 
m=+227.716980224" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.392329 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" event={"ID":"40014780-8cb8-47fa-8b2c-c4fb7d04a85c","Type":"ContainerStarted","Data":"620b99a36572573555d9edafa537b030d63de93db8e78e98b8a825c1b5d40fa1"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.392470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" event={"ID":"40014780-8cb8-47fa-8b2c-c4fb7d04a85c","Type":"ContainerStarted","Data":"2e8fb3a8c94c2c8e1f3885b168e57485f477a668535e95bf79624d6a8b059e8e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.405695 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" event={"ID":"10dc7791-eebd-49e9-8d9c-63711119e9d7","Type":"ContainerStarted","Data":"20d8ad17fce889c0891718ad487bd407853d271e12febcfcd6d0d77ebb01b23e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.415619 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerStarted","Data":"c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.431324 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" event={"ID":"14f49362-2145-40aa-8a7c-e07c70ea910c","Type":"ContainerStarted","Data":"ecf0ef17f4f1af0a2d4aef88de053db8b68324f615935efcde694faed132643f"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.450059 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" podStartSLOduration=163.450042258 podStartE2EDuration="2m43.450042258s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.448612985 +0000 UTC m=+227.776399661" watchObservedRunningTime="2026-03-21 04:27:03.450042258 +0000 UTC m=+227.777828934" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.458886 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hkg98" event={"ID":"f7156267-6917-4c54-ba75-4a91a0772025","Type":"ContainerStarted","Data":"73f7a1853122a3d755ea4ceef3910803fbdfe0ade43faeec6c9d48fc144f223b"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.460006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.468873 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469197 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469343 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" 
event={"ID":"a83789bf-1523-4d5e-892d-6597aed01b7d","Type":"ContainerStarted","Data":"30b49456a6f5e32ba761b3f2f18fba90c9260b2efe733bf0886236747cab63b8"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469482 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" event={"ID":"a83789bf-1523-4d5e-892d-6597aed01b7d","Type":"ContainerStarted","Data":"438e2b25c5a34df45dc61cbf2b0e52c20429a7a022e8882c559f2d615dc2395e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.470047 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.482176 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.483252 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.98323609 +0000 UTC m=+228.311022766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.486168 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" podStartSLOduration=163.486105936 podStartE2EDuration="2m43.486105936s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.478100587 +0000 UTC m=+227.805887273" watchObservedRunningTime="2026-03-21 04:27:03.486105936 +0000 UTC m=+227.813892612" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.488645 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.495477 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" event={"ID":"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9","Type":"ContainerStarted","Data":"42e839eac1f8215c88389e8183b602d5bf8635c8709ad35ed5bbab8cd3660612"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.495508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" event={"ID":"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9","Type":"ContainerStarted","Data":"f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.496639 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerStarted","Data":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499095 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" event={"ID":"a79fc033-c671-42ff-aa06-78ae64967c92","Type":"ContainerStarted","Data":"d0f9f8438d4a07c654faa0f7a003efabe25d31ca356650fe31683d5c6c32e350"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499485 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499519 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.500394 4839 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xldvn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" 
start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.500439 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podUID="a83789bf-1523-4d5e-892d-6597aed01b7d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.503953 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" podStartSLOduration=163.503935029 podStartE2EDuration="2m43.503935029s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.501056383 +0000 UTC m=+227.828843059" watchObservedRunningTime="2026-03-21 04:27:03.503935029 +0000 UTC m=+227.831721705" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.517135 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.522292 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.531504 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" podStartSLOduration=162.531487883 podStartE2EDuration="2m42.531487883s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.529853774 +0000 UTC m=+227.857640450" 
watchObservedRunningTime="2026-03-21 04:27:03.531487883 +0000 UTC m=+227.859274559" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.590956 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.598994 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.098978631 +0000 UTC m=+228.426765307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.679638 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podStartSLOduration=162.679621673 podStartE2EDuration="2m42.679621673s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.59055275 +0000 UTC m=+227.918339436" watchObservedRunningTime="2026-03-21 04:27:03.679621673 +0000 UTC m=+228.007408349" Mar 21 04:27:03 crc kubenswrapper[4839]: 
I0321 04:27:03.679977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" podStartSLOduration=163.679970173 podStartE2EDuration="2m43.679970173s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.653002407 +0000 UTC m=+227.980789093" watchObservedRunningTime="2026-03-21 04:27:03.679970173 +0000 UTC m=+228.007756849" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.692160 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.692792 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.192759226 +0000 UTC m=+228.520545902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.701083 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.706624 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.20660084 +0000 UTC m=+228.534387516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.718609 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.719039 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.723715 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.762059 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" podStartSLOduration=162.762037957 podStartE2EDuration="2m42.762037957s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.760630585 +0000 UTC m=+228.088417261" watchObservedRunningTime="2026-03-21 04:27:03.762037957 +0000 UTC m=+228.089824633" Mar 21 
04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.797344 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.814404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.814806 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.314790845 +0000 UTC m=+228.642577521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.915171 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podStartSLOduration=163.915151756 podStartE2EDuration="2m43.915151756s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.876383287 +0000 UTC m=+228.204169963" watchObservedRunningTime="2026-03-21 
04:27:03.915151756 +0000 UTC m=+228.242938432" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.916000 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.916392 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.416380353 +0000 UTC m=+228.744167029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.004754 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bj929" podStartSLOduration=164.004740035 podStartE2EDuration="2m44.004740035s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.003857678 +0000 UTC m=+228.331644354" watchObservedRunningTime="2026-03-21 04:27:04.004740035 +0000 UTC m=+228.332526711" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.017561 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.017866 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.517832916 +0000 UTC m=+228.845619582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.018023 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.018554 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.518545238 +0000 UTC m=+228.846331914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.118703 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.121262 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.122134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.122523 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.622507816 +0000 UTC m=+228.950294492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.225667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.226034 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.726021572 +0000 UTC m=+229.053808248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.327109 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.327462 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.827447564 +0000 UTC m=+229.155234240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.428856 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.429240 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.929224547 +0000 UTC m=+229.257011223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.508196 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" event={"ID":"14f49362-2145-40aa-8a7c-e07c70ea910c","Type":"ContainerStarted","Data":"759b32f79065b18d44e62b398dade6c55fefe271033c92b34ee0784f8dfe8cf0"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.516145 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"192f801676a6fb3ac811f65cb782f56c185d6472eefb8dd7157fa7aed825fd3c"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.516190 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"c94f6178f711cc6b9a432b444a6693b667b9601efb8005c0fc9c12c421ee6b88"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.519528 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"640abcd252c04a2e697fcd252058fb0c5c2913641a12d4b19aa4820b4d21f3e8"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.519560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"3ac49c6150d492e24dca32f31e74f974cfd0ef0dbbca65933463a2a80fe52ff4"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.523038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" event={"ID":"10dc7791-eebd-49e9-8d9c-63711119e9d7","Type":"ContainerStarted","Data":"9088442267d0b0a7c928d54abdc6411a5ee0f9d3fd810ea7906b30fdc2e96a20"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.524304 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerStarted","Data":"8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.525595 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brnnr" event={"ID":"d1439545-f492-4e4c-858c-ec85c5c2a9d9","Type":"ContainerStarted","Data":"952b273280ea2fa328e0ff549acc309e188c1c302012107463ce46bcfe123548"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.530320 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.530503 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:05.030487815 +0000 UTC m=+229.358274491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.530617 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.530947 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.030928368 +0000 UTC m=+229.358715044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536306 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"534753d8eb130a1f76926fb1632d8209e66157f6c2e8528e30034183ce1dd5b6"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536361 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"c891a419f7328b292b732c2c2e4597a95d912488259d8d7416f2deacf7fb0e9f"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536583 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.541070 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" event={"ID":"685c3b51-a70f-484e-b7db-f98383f75003","Type":"ContainerStarted","Data":"fa84c17e80a9eedcf415cda88681e7867da38af3733d30ac7215c31641fe5e5b"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.541941 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.543555 4839 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-78xr9 container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.543636 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podUID="685c3b51-a70f-484e-b7db-f98383f75003" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.552729 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" podStartSLOduration=163.55271316 podStartE2EDuration="2m43.55271316s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.549663548 +0000 UTC m=+228.877450224" watchObservedRunningTime="2026-03-21 04:27:04.55271316 +0000 UTC m=+228.880499836" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.557386 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"fb7cde5cf5ec6466eb7ac005a25f962746239338ef8ea9b8bad7ad12dd56b03c"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.571442 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"b99776b83e097db45e321be0d5a9b804599d310ae32d00cd000d2780e5aa2659"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.574803 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" 
event={"ID":"ad426123-af7f-45c4-8a6b-bca3c83017be","Type":"ContainerStarted","Data":"60df4a3a0ba78465c0ca36153aa7b3cd78179d02a3f567ab66fe42b9f322cd03"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.576466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"d3e33b3ec52738f286e320aed0cd65c111af76df711d2597da407333eb03030b"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.578269 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" event={"ID":"2bceecf8-583d-4e26-9749-f5939280540b","Type":"ContainerStarted","Data":"4d3879596d130bc865ecded5b7f57a37807b79fb1f74f6290336cf5643820f26"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.579017 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.579969 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.580001 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.589487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" 
event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"6390bd19477356598926f491e42af1c0e0ea94e9190a2096268f962427d67244"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.604104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"230f8693273a2a0a0adeb0f9051bca00e6fc233bce7b1294e58215a7b0da83a8"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.604148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"c92ecc3ea986b94ba6d739b8e9f5cef071893679a7c30802741401c0db211e81"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.614151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerStarted","Data":"fd22ccf9124a36ac68409088a34c8e8b0b1dc177591b8e7ff82797fe49c5947d"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.614204 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.620741 4839 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xldvn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.620802 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podUID="a83789bf-1523-4d5e-892d-6597aed01b7d" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.624848 4839 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8jgh7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.624879 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.625015 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.625077 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.632213 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.633631 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.133610119 +0000 UTC m=+229.461396795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.708122 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-brnnr" podStartSLOduration=7.708103046 podStartE2EDuration="7.708103046s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.596808918 +0000 UTC m=+228.924595594" watchObservedRunningTime="2026-03-21 04:27:04.708103046 +0000 UTC m=+229.035889722" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.730399 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5jhkc" podStartSLOduration=7.730372292 podStartE2EDuration="7.730372292s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.679267394 +0000 UTC m=+229.007054090" watchObservedRunningTime="2026-03-21 04:27:04.730372292 
+0000 UTC m=+229.058158968" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.737159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.740988 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:04 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:04 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:04 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.741062 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.743163 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.243144474 +0000 UTC m=+229.570931150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.748074 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podStartSLOduration=163.748051031 podStartE2EDuration="2m43.748051031s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.706518019 +0000 UTC m=+229.034304695" watchObservedRunningTime="2026-03-21 04:27:04.748051031 +0000 UTC m=+229.075837707" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.797497 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" podStartSLOduration=164.797469079 podStartE2EDuration="2m44.797469079s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.741496065 +0000 UTC m=+229.069282741" watchObservedRunningTime="2026-03-21 04:27:04.797469079 +0000 UTC m=+229.125255755" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.838952 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.839344 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.33932825 +0000 UTC m=+229.667114926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.863222 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.866370 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.876013 4839 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gl7rc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.876080 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podUID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 21 
04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.923014 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podStartSLOduration=164.922999982 podStartE2EDuration="2m44.922999982s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.922197998 +0000 UTC m=+229.249984674" watchObservedRunningTime="2026-03-21 04:27:04.922999982 +0000 UTC m=+229.250786658"
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.925281 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" podStartSLOduration=163.92525486 podStartE2EDuration="2m43.92525486s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.86106656 +0000 UTC m=+229.188853236" watchObservedRunningTime="2026-03-21 04:27:04.92525486 +0000 UTC m=+229.253041536"
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.940482 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.940861 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.440848946 +0000 UTC m=+229.768635622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.995523 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" podStartSLOduration=163.99550188 podStartE2EDuration="2m43.99550188s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.995248543 +0000 UTC m=+229.323035229" watchObservedRunningTime="2026-03-21 04:27:04.99550188 +0000 UTC m=+229.323288556"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.042551 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.043013 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.542995391 +0000 UTC m=+229.870782067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.101665 4839 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-85pc8 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]log ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]etcd ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/max-in-flight-filter ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartUserInformer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Mar 21 04:27:05 crc kubenswrapper[4839]: livez check failed
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.101722 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" podUID="67adff78-dfe5-440a-80b0-fefd703c3aa7" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.144941 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.145405 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.645392252 +0000 UTC m=+229.973178918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.170948 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podStartSLOduration=164.170930356 podStartE2EDuration="2m44.170930356s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.118587921 +0000 UTC m=+229.446374597" watchObservedRunningTime="2026-03-21 04:27:05.170930356 +0000 UTC m=+229.498717032"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.245719 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.245907 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.745881457 +0000 UTC m=+230.073668133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.246149 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.246474 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.746462055 +0000 UTC m=+230.074248731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.257126 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" podStartSLOduration=164.257105093 podStartE2EDuration="2m44.257105093s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.174325298 +0000 UTC m=+229.502111974" watchObservedRunningTime="2026-03-21 04:27:05.257105093 +0000 UTC m=+229.584891769"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.327143 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podStartSLOduration=165.327124937 podStartE2EDuration="2m45.327124937s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.323391585 +0000 UTC m=+229.651178261" watchObservedRunningTime="2026-03-21 04:27:05.327124937 +0000 UTC m=+229.654911613"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.328371 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" podStartSLOduration=164.328365274 podStartE2EDuration="2m44.328365274s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.258507405 +0000 UTC m=+229.586294081" watchObservedRunningTime="2026-03-21 04:27:05.328365274 +0000 UTC m=+229.656151950"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.347945 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.348294 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.848277379 +0000 UTC m=+230.176064055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.427052 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" podStartSLOduration=165.427035844 podStartE2EDuration="2m45.427035844s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.426947302 +0000 UTC m=+229.754733978" watchObservedRunningTime="2026-03-21 04:27:05.427035844 +0000 UTC m=+229.754822520"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.427118 4839 ???:1] "http: TLS handshake error from 192.168.126.11:53464: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.449549 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.449887 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.949870837 +0000 UTC m=+230.277657503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.495173 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" podStartSLOduration=164.495149211 podStartE2EDuration="2m44.495149211s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.493629966 +0000 UTC m=+229.821416642" watchObservedRunningTime="2026-03-21 04:27:05.495149211 +0000 UTC m=+229.822935887"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.551235 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.551977 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.05195508 +0000 UTC m=+230.379741756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.564134 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43516: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.627481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"b345b9124aa14eedcd6fa38210b1af5914af54e5d8dca691b755df6154040f68"}
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.629705 4839 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-78xr9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body=
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.629754 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podUID="685c3b51-a70f-484e-b7db-f98383f75003" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.640700 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.640818 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.653804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.656706 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.156691832 +0000 UTC m=+230.484478588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.663926 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43532: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.741830 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:05 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.741898 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.759941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.760273 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.260257529 +0000 UTC m=+230.588044205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.803031 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43534: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.861325 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.861782 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.361765884 +0000 UTC m=+230.689552560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.905993 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43538: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.962360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.962909 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.462889138 +0000 UTC m=+230.790675824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.997867 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43544: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.064197 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.064480 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.564468066 +0000 UTC m=+230.892254742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.108147 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43554: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.165107 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.165297 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.6652677 +0000 UTC m=+230.993054376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.165981 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.166345 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.666330172 +0000 UTC m=+230.994116848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207314 4839 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-p4nnp container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207369 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podUID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207322 4839 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-p4nnp container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207579 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podUID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.251529 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43558: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.267599 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.267733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.767714683 +0000 UTC m=+231.095501359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.267781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.268055 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.768047883 +0000 UTC m=+231.095834559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.369248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.369444 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.869416854 +0000 UTC m=+231.197203530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.369495 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.369844 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.869835227 +0000 UTC m=+231.197621903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.475556 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.475675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.975652611 +0000 UTC m=+231.303439297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.475905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.476331 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.976320611 +0000 UTC m=+231.304107287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.577423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.577719 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.077702423 +0000 UTC m=+231.405489099 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.631177 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.631232 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.632609 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.632646 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.660006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.680604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.683228 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.183209848 +0000 UTC m=+231.510996594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.731793 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:06 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:06 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:06 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.731840 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.782075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.782405 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:07.282389753 +0000 UTC m=+231.610176429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.885302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.885659 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.385646481 +0000 UTC m=+231.713433157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.958978 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43566: no serving certificate available for the kubelet" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.985872 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.986390 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.486359382 +0000 UTC m=+231.814146068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.088412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.088829 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.588812766 +0000 UTC m=+231.916599442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.186505 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.187678 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.188998 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.189306 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.689293041 +0000 UTC m=+232.017079717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.192173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.229347 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290275 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290367 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290407 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod 
\"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290464 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.290802 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.790786785 +0000 UTC m=+232.118573461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.366400 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.367946 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.374473 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.389962 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391499 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391939 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " 
pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.392558 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.892540318 +0000 UTC m=+232.220326994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392846 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.401408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.401685 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" containerID="cri-o://3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" gracePeriod=30 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.435048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.458367 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.458631 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" containerID="cri-o://e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" gracePeriod=30 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.493922 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.494275 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:07.99426183 +0000 UTC m=+232.322048496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494521 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494552 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.568402 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 
04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.569427 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.576506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.591263 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.595606 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.595914 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596032 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: 
\"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.596386 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.096365203 +0000 UTC m=+232.424151879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596872 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.642117 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.642172 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.663286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.683352 4839 generic.go:334] "Generic (PLEG): container finished" podID="3498feaf-72d5-471a-b25e-fb4b68875767" containerID="3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.683451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerDied","Data":"3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.693960 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697192 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697224 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697318 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.697733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.197717984 +0000 UTC m=+232.525504660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.709858 4839 generic.go:334] "Generic (PLEG): container finished" podID="e81e2384-94b0-4639-bb2d-e4152385c932" containerID="e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.709941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerDied","Data":"e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.714178 4839 generic.go:334] "Generic (PLEG): container finished" podID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerID="8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.715293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerDied","Data":"8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.728829 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:07 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:07 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:07 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.728908 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.772157 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.773377 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.789191 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 
04:27:07.799704 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.800085 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.300070454 +0000 UTC m=+232.627857130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.801128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.801940 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.826262 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900828 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.901041 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.401025512 +0000 UTC m=+232.728812188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.947932 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.999888 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005192 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.005311 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.50529033 +0000 UTC m=+232.833077006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005813 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005865 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005933 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006025 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006864 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca" (OuterVolumeSpecName: "client-ca") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007510 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config" (OuterVolumeSpecName: "config") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007798 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008056 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008273 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008299 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008315 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.009023 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.009961 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.010309 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.510288379 +0000 UTC m=+232.838075205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.013063 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.014897 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf" (OuterVolumeSpecName: "kube-api-access-wp2sf") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "kube-api-access-wp2sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.029353 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.082451 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.115532 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.115903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.615865456 +0000 UTC m=+232.943652152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116093 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116277 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp2sf\" (UniqueName: 
\"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116300 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.116692 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.616680301 +0000 UTC m=+232.944466977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.124112 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.158204 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.216956 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217148 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217180 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217262 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" 
(UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config" (OuterVolumeSpecName: "config") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.218395 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca" (OuterVolumeSpecName: "client-ca") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.218479 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.718454324 +0000 UTC m=+233.046241050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.222697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.225943 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q" (OuterVolumeSpecName: "kube-api-access-64b9q") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "kube-api-access-64b9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.286421 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43574: no serving certificate available for the kubelet" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.313213 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320547 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320771 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320795 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320806 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320819 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.320961 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.820945209 +0000 UTC m=+233.148731885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.328424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.422129 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.424119 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.922508536 +0000 UTC m=+233.250295212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.448064 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:08 crc kubenswrapper[4839]: W0321 04:27:08.472845 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc99f39a_8001_466b_acf1_bd106eb2b81d.slice/crio-a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40 WatchSource:0}: Error finding container a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40: Status 404 returned error can't find the container with id a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40 Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.523644 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.524080 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.024042852 +0000 UTC m=+233.351829528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.624658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.624789 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.124757624 +0000 UTC m=+233.452544300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.625110 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.625529 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.125511646 +0000 UTC m=+233.453298322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.720743 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:08 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:08 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:08 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.720805 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.726075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.726256 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:09.226234998 +0000 UTC m=+233.554021694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.726375 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.726720 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.226709483 +0000 UTC m=+233.554496159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730206 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerDied","Data":"37e1b23e6295895c07381fb6ccbe7c11a0a79e23312f6edb20749b2a0cf5c684"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730252 4839 scope.go:117] "RemoveContainer" containerID="3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730428 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.738097 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerStarted","Data":"45c4f382e92761207baa8e2c4160a24c616ae580e9d48096eb767d7eab157d90"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.740271 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.740861 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerDied","Data":"14f69a0375e6c5d03a334a0b16024be0f89d91b54fd01559c41e7673087b6d53"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.743870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerStarted","Data":"3c535ea31a5aa838095ae16f33b0780c50c3c3698c73e47ae8e3f30c17a3ac39"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.751533 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerStarted","Data":"b5f435157e1b2e816a83545c0d59dbf17d3143a0eb363bf4e4b546731c0c8b35"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.758852 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.777633 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.788084 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.810644 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.811220 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827158 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.827357 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.327326031 +0000 UTC m=+233.655112707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827504 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827862 4839 scope.go:117] "RemoveContainer" containerID="e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.828056 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.328046483 +0000 UTC m=+233.655833159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.929033 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.929245 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.429216108 +0000 UTC m=+233.757002784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.929387 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.929758 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.429750304 +0000 UTC m=+233.757537050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.030366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.031090 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.531071814 +0000 UTC m=+233.858858490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.101389 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.127459 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.132169 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.132682 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.632667862 +0000 UTC m=+233.960454538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.133311 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.219639 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.232908 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.232963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.233088 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc 
kubenswrapper[4839]: I0321 04:27:09.233130 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.235628 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.235699 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.735685532 +0000 UTC m=+234.063472198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.240762 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch" (OuterVolumeSpecName: "kube-api-access-l9wch") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). 
InnerVolumeSpecName "kube-api-access-l9wch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.242223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.289488 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290002 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290022 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290033 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290039 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290054 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290286 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290530 4839 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290546 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290581 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.291181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.296132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.297005 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.299103 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.335472 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.335804 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:09.835784326 +0000 UTC m=+234.163571002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338705 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338740 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338961 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338972 4839 
reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.362811 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.363752 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.366749 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.372156 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.440272 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.940248869 +0000 UTC m=+234.268035545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440331 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440413 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440439 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440462 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440483 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.440815 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.940803106 +0000 UTC m=+234.268589782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440854 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.461018 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.461656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.464681 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.465904 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.466162 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.470325 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.541983 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542357 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542390 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.542752 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.042724064 +0000 UTC m=+234.370510740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.543115 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.562357 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.632662 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.643887 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.643932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.644031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.644122 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.644347 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.144329472 +0000 UTC m=+234.472116148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.671443 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.683172 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.689021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.689874 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700143 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700321 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700489 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700619 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700733 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700845 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.704346 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.705321 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.707553 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.715913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.722713 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:09 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:09 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:09 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.722794 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.731434 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.731993 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.732126 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.732341 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.739474 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.739855 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.742847 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.744942 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745458 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.745585 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.245532728 +0000 UTC m=+234.573319564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745656 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746131 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746203 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746222 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746316 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.746818 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.246789676 +0000 UTC m=+234.574576562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.781982 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.804354 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805280 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805402 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805278 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250" exitCode=0
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.807916 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"]
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.814041 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" exitCode=0
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.814349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.817675 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" exitCode=0
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.817739 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.824311 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7" exitCode=0
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.824373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerDied","Data":"c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844728 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844800 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.848675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.348658172 +0000 UTC m=+234.676444848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.848715 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.848898 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.849021 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.849701 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.850385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.851779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.851868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853731 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853810 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853913 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.854025 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.854055 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853558 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.856896 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.356877988 +0000 UTC m=+234.684664664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.857461 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.863173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.866038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"b49869ad25ce4e8eafa49f497261ed5f8cd6333d47345feb9f2037cf9f52cbee"}
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.866897 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.868518 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.879652 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.880459 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.884927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.889096 4839 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gl7rc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed
with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]log ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]etcd ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/max-in-flight-filter ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 21 04:27:09 crc kubenswrapper[4839]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/project.openshift.io-projectcache ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-startinformers ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:27:09 crc kubenswrapper[4839]: livez check failed Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.889175 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podUID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.932960 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 
10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.932997 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.933025 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.933032 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.943304 4839 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961077 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961286 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961337 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961395 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.961903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.461889888 +0000 UTC m=+234.789676564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.962631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.964067 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.984168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.042066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.055650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.063346 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: E0321 04:27:10.063891 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.563872707 +0000 UTC m=+234.891659383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.069893 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.092771 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode964573d_8ca6_4f88_8754_f34b3aa57504.slice/crio-a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d WatchSource:0}: Error finding container a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d: Status 404 returned error can't find the container with id a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.127889 4839 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-21T04:27:09.943338713Z","Handler":null,"Name":""} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.129776 4839 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.129812 4839 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.138470 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.142253 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.164403 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.168776 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.183367 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b7a7313_21c4_4909_9ebe_ebe552b29b8c.slice/crio-6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425 WatchSource:0}: Error finding container 6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425: Status 404 returned error can't find the container with id 6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.220225 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.246470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.246551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.249174 4839 patch_prober.go:28] interesting pod/console-f9d7485db-bj929 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.249231 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.270990 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.272403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.280315 4839 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.280359 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.363938 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.365434 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.369886 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.391401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.401419 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.420285 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.435401 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24bacec_594f_429f_8e02_73abc6c4b092.slice/crio-788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08 WatchSource:0}: Error finding container 788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08: Status 404 returned error can't find the container with id 788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475485 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " 
pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475550 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475610 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.477308 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" path="/var/lib/kubelet/pods/3498feaf-72d5-471a-b25e-fb4b68875767/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.478273 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.478823 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" path="/var/lib/kubelet/pods/e81e2384-94b0-4639-bb2d-e4152385c932/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.480629 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.542500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:27:10 
crc kubenswrapper[4839]: I0321 04:27:10.565345 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.584646 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.585612 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.586114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.586166 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.585913 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod 
\"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.609334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.717211 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.723346 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.732947 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:10 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:10 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:10 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.733030 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.761796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 
04:27:10.768511 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.769892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.795380 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890817 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890924 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.900345 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43584: no serving certificate available for the kubelet" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.943054 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerStarted","Data":"c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.943136 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerStarted","Data":"273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.945586 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerStarted","Data":"5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.945607 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerStarted","Data":"6319a8a5c70f22df1c28604d34ec4101c8e3e996be0a10469bb57b8a38840242"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.946349 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerStarted","Data":"155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974316 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerStarted","Data":"788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.976591 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.976536899 podStartE2EDuration="1.976536899s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:10.974463257 +0000 UTC m=+235.302249943" watchObservedRunningTime="2026-03-21 04:27:10.976536899 +0000 UTC m=+235.304323575" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988414 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5" exitCode=0 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988525 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"ffa5ea5aab95eb6762df26e9e80adb2d4051bdda8b4098100f8d0af4408a8c40"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.993812 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.993884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.994030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.994840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.995947 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999493 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podStartSLOduration=3.999471184 podStartE2EDuration="3.999471184s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:10.993160126 +0000 UTC m=+235.320946802" watchObservedRunningTime="2026-03-21 04:27:10.999471184 +0000 UTC m=+235.327257860" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999810 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" exitCode=0 Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerStarted","Data":"6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.021659 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podStartSLOduration=4.021639187 podStartE2EDuration="4.021639187s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.019204665 +0000 UTC m=+235.346991351" watchObservedRunningTime="2026-03-21 04:27:11.021639187 +0000 UTC m=+235.349425863" Mar 21 04:27:11 crc 
kubenswrapper[4839]: I0321 04:27:11.030028 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.589381 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.589442 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.597949 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.612408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerStarted","Data":"9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.612458 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerStarted","Data":"a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.655086 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.656342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"42a43bb3aceec4702990fb0046882e1fb903fed10830c4fd8ed164857f556782"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.656381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"a37cf9596f67207128283a14feff2f3773edd83d2130f0b3b042271179a72d24"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.652713 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.652687266 podStartE2EDuration="2.652687266s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.641023838 
+0000 UTC m=+235.968810514" watchObservedRunningTime="2026-03-21 04:27:11.652687266 +0000 UTC m=+235.980473952" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.690388 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.695056 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" podStartSLOduration=14.695032333 podStartE2EDuration="14.695032333s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.691622691 +0000 UTC m=+236.019409367" watchObservedRunningTime="2026-03-21 04:27:11.695032333 +0000 UTC m=+236.022819009" Mar 21 04:27:11 crc kubenswrapper[4839]: W0321 04:27:11.703219 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1d0c02_87bf_4c8b_bc1c_d25007fb3c1c.slice/crio-45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf WatchSource:0}: Error finding container 45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf: Status 404 returned error can't find the container with id 45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.720939 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:11 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:11 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:11 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.721026 4839 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.858932 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.164358 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:12 crc kubenswrapper[4839]: W0321 04:27:12.167950 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb144748c_2940_4efe_a486_d2b5c1239b12.slice/crio-9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165 WatchSource:0}: Error finding container 9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165: Status 404 returned error can't find the container with id 9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165 Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.669202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.669258 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.671210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" 
event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.672971 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerStarted","Data":"db289ed2561962adc1edb7c7cc7d0a2aafe884fed424734dbdd27242d856949f"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.675274 4839 generic.go:334] "Generic (PLEG): container finished" podID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerID="9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082" exitCode=0 Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.675427 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerDied","Data":"9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.680851 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.720465 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:12 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:12 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:12 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.720554 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" 
podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.684362 4839 generic.go:334] "Generic (PLEG): container finished" podID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerID="c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.684434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerDied","Data":"c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.687865 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.687950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.689815 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.689873 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.693878 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerStarted","Data":"0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.719502 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:13 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:13 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:13 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.719585 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.969354 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000545 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"e964573d-8ca6-4f88-8754-f34b3aa57504\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e964573d-8ca6-4f88-8754-f34b3aa57504" (UID: "e964573d-8ca6-4f88-8754-f34b3aa57504"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000848 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"e964573d-8ca6-4f88-8754-f34b3aa57504\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.001103 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.012911 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e964573d-8ca6-4f88-8754-f34b3aa57504" (UID: "e964573d-8ca6-4f88-8754-f34b3aa57504"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.103214 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.115416 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43590: no serving certificate available for the kubelet" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702106 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerDied","Data":"a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d"} Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702160 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702287 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702121 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727207 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:14 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:14 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:14 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727301 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727697 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" podStartSLOduration=174.727682947 podStartE2EDuration="2m54.727682947s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:14.720907735 +0000 UTC m=+239.048694431" watchObservedRunningTime="2026-03-21 04:27:14.727682947 +0000 UTC m=+239.055469623" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.866019 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.870791 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.983775 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140438 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b496f66-c844-4eed-b91d-7c0b6b796e5e" (UID: "5b496f66-c844-4eed-b91d-7c0b6b796e5e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.141149 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.146450 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b496f66-c844-4eed-b91d-7c0b6b796e5e" (UID: "5b496f66-c844-4eed-b91d-7c0b6b796e5e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.242687 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerDied","Data":"273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a"} Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710668 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710709 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.720471 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:15 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:15 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:15 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.720554 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.773794 4839 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.132603 4839 ???:1] "http: TLS handshake error from 192.168.126.11:52940: no serving certificate available for the kubelet" Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.723549 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:16 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:16 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:16 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.723631 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:17 crc kubenswrapper[4839]: I0321 04:27:17.719705 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:17 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:17 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:17 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:17 crc kubenswrapper[4839]: I0321 04:27:17.720308 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:18 crc kubenswrapper[4839]: I0321 04:27:18.720169 4839 
patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:18 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:18 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:18 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:18 crc kubenswrapper[4839]: I0321 04:27:18.720234 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.722482 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:19 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:19 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:19 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.722541 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.931978 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.932045 4839 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.931979 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.932213 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.247124 4839 patch_prober.go:28] interesting pod/console-f9d7485db-bj929 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.247184 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused"
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.720215 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:20 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:20 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:20 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.720283 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.227996 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.230115 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.243943 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.385325 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.393256 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.720340 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:21 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:21 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:21 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.720694 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:22 crc kubenswrapper[4839]: I0321 04:27:22.720015 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:22 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:22 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:22 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:22 crc kubenswrapper[4839]: I0321 04:27:22.720115 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:23 crc kubenswrapper[4839]: I0321 04:27:23.720537 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:23 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:23 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:23 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:23 crc kubenswrapper[4839]: I0321 04:27:23.720789 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:24 crc kubenswrapper[4839]: I0321 04:27:24.720810 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:24 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:24 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:24 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:24 crc kubenswrapper[4839]: I0321 04:27:24.721105 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:25 crc kubenswrapper[4839]: I0321 04:27:25.720243 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:25 crc kubenswrapper[4839]: I0321 04:27:25.722073 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.682034 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"]
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.682261 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" containerID="cri-o://155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" gracePeriod=30
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.709922 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"]
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.710403 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" containerID="cri-o://5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" gracePeriod=30
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.847045 4839 generic.go:334] "Generic (PLEG): container finished" podID="e24bacec-594f-429f-8e02-73abc6c4b092" containerID="155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" exitCode=0
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.847133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerDied","Data":"155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72"}
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.849785 4839 generic.go:334] "Generic (PLEG): container finished" podID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerID="5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" exitCode=0
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.849810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerDied","Data":"5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0"}
Mar 21 04:27:29 crc kubenswrapper[4839]: I0321 04:27:29.936184 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qp8mz"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.043799 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.043863 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.073249 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.073308 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.350446 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.354839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.570523 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.980723 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.980791 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:27:36 crc kubenswrapper[4839]: I0321 04:27:36.642173 4839 ???:1] "http: TLS handshake error from 192.168.126.11:39288: no serving certificate available for the kubelet"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.508101 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.508535 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nt4t7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mxrc8_openshift-marketplace(6513c45b-dd98-40b0-b69c-94db4d1c916e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.509701 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e"
Mar 21 04:27:40 crc kubenswrapper[4839]: I0321 04:27:40.137449 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.043327 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.043415 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.071108 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.071224 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.086558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.681791 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.682216 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682259 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.682297 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682314 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682665 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682714 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.683508 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.685964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.686194 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.691333 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.866947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.867005 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968380 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968479 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:43 crc kubenswrapper[4839]: I0321 04:27:43.002751 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:43 crc kubenswrapper[4839]: I0321 04:27:43.015422 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.068797 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.071208 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.079237 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131092 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232699 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232951 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.250130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.399211 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.043993 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.044379 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.071117 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.071188 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.077900 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.078058 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:27:51 crc kubenswrapper[4839]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 21 04:27:51 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567786-d8w8k_openshift-infra(609ace61-45d1-44f6-b378-fb97eecf2374): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 21 04:27:51 crc kubenswrapper[4839]: > logger="UnhandledError"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.080058 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podUID="609ace61-45d1-44f6-b378-fb97eecf2374"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.996142 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podUID="609ace61-45d1-44f6-b378-fb97eecf2374"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.737062 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.741977 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771263 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"]
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.771556 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771583 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.771593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771599 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771725 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771738 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.772182 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.775009 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"]
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.888029 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.888238 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7dr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qrqj2_openshift-marketplace(f1ec80e5-557b-4c30-8323-87d6b1447a6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.890667 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906181 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906252 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906281 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906315 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906357 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906593 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906639 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") "
pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906668 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906700 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config" (OuterVolumeSpecName: "config") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907398 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca" (OuterVolumeSpecName: "client-ca") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca" (OuterVolumeSpecName: "client-ca") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.908347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config" (OuterVolumeSpecName: "config") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910732 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910802 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910933 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng" (OuterVolumeSpecName: "kube-api-access-vhfng") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "kube-api-access-vhfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.911882 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7" (OuterVolumeSpecName: "kube-api-access-s5hv7") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "kube-api-access-s5hv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.999774 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerDied","Data":"6319a8a5c70f22df1c28604d34ec4101c8e3e996be0a10469bb57b8a38840242"} Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.999826 4839 scope.go:117] "RemoveContainer" containerID="5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:52.999923 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.002734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerDied","Data":"788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08"} Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.003659 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007937 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007952 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007963 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007975 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007987 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007997 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008007 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008018 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008027 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009367 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009460 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.012128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.027026 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.050255 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.057183 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.060707 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.064216 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.094438 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:54 crc kubenswrapper[4839]: I0321 04:27:54.466914 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" path="/var/lib/kubelet/pods/8805db9c-11be-498e-9f1f-7bc6914dba76/volumes" Mar 21 04:27:54 crc kubenswrapper[4839]: I0321 04:27:54.468168 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" path="/var/lib/kubelet/pods/e24bacec-594f-429f-8e02-73abc6c4b092/volumes" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.318915 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.319071 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w9sw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-v4btp_openshift-marketplace(dc99f39a-8001-466b-acf1-bd106eb2b81d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.320230 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" Mar 21 04:27:55 crc 
kubenswrapper[4839]: I0321 04:27:55.717755 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.720781 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.722637 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723078 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723663 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.724905 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.724914 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.725134 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740374 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740410 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740558 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841344 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " 
pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841400 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841449 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841552 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.842833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.843635 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.855672 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.869248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:58 crc kubenswrapper[4839]: I0321 04:27:56.036536 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:58 crc kubenswrapper[4839]: E0321 04:27:57.831334 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" Mar 21 04:27:58 crc kubenswrapper[4839]: E0321 04:27:57.831365 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.474453 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.474647 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncnmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9qjgq_openshift-marketplace(0b7a7313-21c4-4909-9ebe-ebe552b29b8c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.475898 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" Mar 21 04:27:59 crc 
kubenswrapper[4839]: E0321 04:27:59.696340 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.696934 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57m7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-7m29z_openshift-marketplace(c3ae9e7a-784b-4a39-bd4e-10dbff65cd50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.698106 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.849943 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.850134 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krww4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nw7r6_openshift-marketplace(65a571df-f531-458b-9aed-6de99e4607e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.851558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" Mar 21 04:28:00 crc 
kubenswrapper[4839]: I0321 04:28:00.126882 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.128186 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.130830 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.134331 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.190008 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.291558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.308811 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc 
kubenswrapper[4839]: I0321 04:28:00.447451 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.980984 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.981263 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.587988 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.588123 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.588280 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.615343 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.615634 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jctj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,R
esizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zgfcm_openshift-marketplace(5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.617144 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" Mar 21 04:28:02 crc kubenswrapper[4839]: I0321 04:28:02.624991 4839 scope.go:117] "RemoveContainer" containerID="155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.626386 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.626546 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d994r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cznml_openshift-marketplace(b144748c-2940-4efe-a486-d2b5c1239b12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.628316 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" Mar 21 04:28:03 crc 
kubenswrapper[4839]: I0321 04:28:03.076015 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.079245 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.094132 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.161952 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.174471 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.178672 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.562764 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd62030f0_7d6f_46f7_83a2_a28fafe1a4ef.slice/crio-068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3 WatchSource:0}: Error finding container 068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3: Status 404 returned error can't find the container with id 068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.564540 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602cd797_e549_4e26_a152_a1cb4decf82d.slice/crio-37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94 WatchSource:0}: Error finding container 37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94: Status 404 returned error 
can't find the container with id 37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.565275 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa13ce27_53f2_4178_8560_251f0bb3f034.slice/crio-4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874 WatchSource:0}: Error finding container 4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874: Status 404 returned error can't find the container with id 4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.567333 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45deb0c_4247_4d23_86db_a897c7f7e7f2.slice/crio-7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784 WatchSource:0}: Error finding container 7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784: Status 404 returned error can't find the container with id 7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.569197 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4842010_e137_466d_9596_e65f0cf2f4da.slice/crio-8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e WatchSource:0}: Error finding container 8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e: Status 404 returned error can't find the container with id 8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e Mar 21 04:28:03 crc kubenswrapper[4839]: E0321 04:28:03.569529 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.570519 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode3a71e7a_3ead_483f_8de2_9dbf3a336182.slice/crio-c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502 WatchSource:0}: Error finding container c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502: Status 404 returned error can't find the container with id c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502 Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.056302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerStarted","Data":"8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.058151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.059048 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerStarted","Data":"c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.059980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerStarted","Data":"7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784"} Mar 21 04:28:04 
crc kubenswrapper[4839]: I0321 04:28:04.061845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerStarted","Data":"37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.063009 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerStarted","Data":"068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.071374 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerStarted","Data":"c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.073331 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" exitCode=0 Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.073407 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.075518 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"d69c82792ad201cbdc2bc4454792efe9e2d4e64a634f0d00512c8c3edde44b9c"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.075546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"ceafc028fef67080a2b5efff83e1dd8de71061972f2082c19d9e75b580a7dcc7"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.078818 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerStarted","Data":"4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.082481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerStarted","Data":"daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.082712 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.085982 4839 generic.go:334] "Generic (PLEG): container finished" podID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerID="e8a9f98548a454114e260e93906bc2ca769f62dc7de5ac4bc5fa88c6f2fff894" exitCode=0 Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.086074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerDied","Data":"e8a9f98548a454114e260e93906bc2ca769f62dc7de5ac4bc5fa88c6f2fff894"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.087805 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerStarted","Data":"5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642"} 
Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.088056 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.089890 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.095467 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=17.09543494 podStartE2EDuration="17.09543494s" podCreationTimestamp="2026-03-21 04:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.090351243 +0000 UTC m=+289.418137939" watchObservedRunningTime="2026-03-21 04:28:05.09543494 +0000 UTC m=+289.423221666" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.097660 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.114383 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567788-9snlp" podStartSLOduration=4.108270692 podStartE2EDuration="5.114355877s" podCreationTimestamp="2026-03-21 04:28:00 +0000 UTC" firstStartedPulling="2026-03-21 04:28:03.706165513 +0000 UTC m=+288.033952219" lastFinishedPulling="2026-03-21 04:28:04.712250728 +0000 UTC m=+289.040037404" observedRunningTime="2026-03-21 04:28:05.108232503 +0000 UTC m=+289.436019179" watchObservedRunningTime="2026-03-21 04:28:05.114355877 +0000 UTC m=+289.442142553" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.144815 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-445ww" podStartSLOduration=225.144779044 podStartE2EDuration="3m45.144779044s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.142921404 +0000 UTC m=+289.470708090" watchObservedRunningTime="2026-03-21 04:28:05.144779044 +0000 UTC m=+289.472565720" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.182324 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" podStartSLOduration=19.182301269 podStartE2EDuration="19.182301269s" podCreationTimestamp="2026-03-21 04:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.177323776 +0000 UTC m=+289.505110462" watchObservedRunningTime="2026-03-21 04:28:05.182301269 +0000 UTC m=+289.510087945" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.199336 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" podStartSLOduration=19.199321506 podStartE2EDuration="19.199321506s" podCreationTimestamp="2026-03-21 04:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.198693139 +0000 UTC m=+289.526479825" watchObservedRunningTime="2026-03-21 04:28:05.199321506 +0000 UTC m=+289.527108182" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.428436 4839 csr.go:261] certificate signing request csr-wtcz9 is approved, waiting to be issued Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.435475 4839 csr.go:257] certificate signing request csr-wtcz9 is issued Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.101744 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerStarted","Data":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.104513 4839 generic.go:334] "Generic (PLEG): container finished" podID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerID="4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927" exitCode=0 Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.104776 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerDied","Data":"4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927"} Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.124825 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxrc8" podStartSLOduration=3.330857033 podStartE2EDuration="59.124808069s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.826772367 +0000 UTC m=+234.154559043" lastFinishedPulling="2026-03-21 04:28:05.620723393 +0000 UTC m=+289.948510079" observedRunningTime="2026-03-21 04:28:06.119902947 +0000 UTC m=+290.447689623" watchObservedRunningTime="2026-03-21 04:28:06.124808069 +0000 UTC m=+290.452594745" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.379683 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.437035 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 20:36:14.495149546 +0000 UTC Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.437101 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7120h8m8.058052028s for next certificate rotation Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482467 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482665 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" (UID: "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.483294 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.491032 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" (UID: "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.584588 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.112947 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerStarted","Data":"de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc"} Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerDied","Data":"068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3"} Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117898 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117948 4839 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.421242 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437099 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podStartSLOduration=62.825881119 podStartE2EDuration="2m7.437080329s" podCreationTimestamp="2026-03-21 04:26:00 +0000 UTC" firstStartedPulling="2026-03-21 04:27:02.143648313 +0000 UTC m=+226.471434989" lastFinishedPulling="2026-03-21 04:28:06.754847523 +0000 UTC m=+291.082634199" observedRunningTime="2026-03-21 04:28:07.131075108 +0000 UTC m=+291.458861794" watchObservedRunningTime="2026-03-21 04:28:07.437080329 +0000 UTC m=+291.764867005" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437333 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 08:14:04.031510178 +0000 UTC Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437355 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6555h45m56.594157792s for next certificate rotation Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.601110 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.610332 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4" (OuterVolumeSpecName: 
"kube-api-access-f67r4") pod "a45deb0c-4247-4d23-86db-a897c7f7e7f2" (UID: "a45deb0c-4247-4d23-86db-a897c7f7e7f2"). InnerVolumeSpecName "kube-api-access-f67r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.694934 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.696116 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.703495 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.124878 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerDied","Data":"7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784"} Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.125292 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.124914 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.127068 4839 generic.go:334] "Generic (PLEG): container finished" podID="609ace61-45d1-44f6-b378-fb97eecf2374" containerID="de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc" exitCode=0 Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.127148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerDied","Data":"de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc"} Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.976469 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:08 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:08 crc kubenswrapper[4839]: > Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.427795 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.535682 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"609ace61-45d1-44f6-b378-fb97eecf2374\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.541904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf" (OuterVolumeSpecName: "kube-api-access-vjkdf") pod "609ace61-45d1-44f6-b378-fb97eecf2374" (UID: "609ace61-45d1-44f6-b378-fb97eecf2374"). InnerVolumeSpecName "kube-api-access-vjkdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.637521 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.143732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerDied","Data":"3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50"} Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.144108 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50" Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.144742 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:28:14 crc kubenswrapper[4839]: I0321 04:28:14.165674 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa"} Mar 21 04:28:15 crc kubenswrapper[4839]: I0321 04:28:15.174226 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa" exitCode=0 Mar 21 04:28:15 crc kubenswrapper[4839]: I0321 04:28:15.174271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa"} Mar 21 04:28:17 crc kubenswrapper[4839]: I0321 04:28:17.762144 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:17 crc kubenswrapper[4839]: I0321 04:28:17.803239 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.197546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.199340 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 
04:28:19.199421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.202337 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.202411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.206557 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.206654 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.209561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.213734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" 
event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.306414 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v4btp" podStartSLOduration=3.818533605 podStartE2EDuration="1m12.30639409s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.835838258 +0000 UTC m=+234.163624934" lastFinishedPulling="2026-03-21 04:28:18.323698743 +0000 UTC m=+302.651485419" observedRunningTime="2026-03-21 04:28:19.304859969 +0000 UTC m=+303.632646665" watchObservedRunningTime="2026-03-21 04:28:19.30639409 +0000 UTC m=+303.634180766" Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.222990 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa" exitCode=0 Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.223077 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.226359 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.232964 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357" exitCode=0 Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.233050 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.241595 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerStarted","Data":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.243814 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerStarted","Data":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.247638 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerStarted","Data":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.250753 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" exitCode=0 Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.250846 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.253122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" 
event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.272022 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qjgq" podStartSLOduration=4.087990774 podStartE2EDuration="1m12.27200823s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="2026-03-21 04:27:11.612063602 +0000 UTC m=+235.939850278" lastFinishedPulling="2026-03-21 04:28:19.796081058 +0000 UTC m=+304.123867734" observedRunningTime="2026-03-21 04:28:21.271255459 +0000 UTC m=+305.599042135" watchObservedRunningTime="2026-03-21 04:28:21.27200823 +0000 UTC m=+305.599794906" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.335348 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nw7r6" podStartSLOduration=4.326745562 podStartE2EDuration="1m14.335326079s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.835517029 +0000 UTC m=+234.163303705" lastFinishedPulling="2026-03-21 04:28:19.844097546 +0000 UTC m=+304.171884222" observedRunningTime="2026-03-21 04:28:21.333555901 +0000 UTC m=+305.661342577" watchObservedRunningTime="2026-03-21 04:28:21.335326079 +0000 UTC m=+305.663112755" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.337304 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrqj2" podStartSLOduration=4.406572114 podStartE2EDuration="1m14.337291381s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.83587402 +0000 UTC m=+234.163660696" lastFinishedPulling="2026-03-21 04:28:19.766593287 +0000 UTC m=+304.094379963" observedRunningTime="2026-03-21 04:28:21.318354013 +0000 UTC m=+305.646140689" 
watchObservedRunningTime="2026-03-21 04:28:21.337291381 +0000 UTC m=+305.665078057" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.351417 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cznml" podStartSLOduration=6.217478847 podStartE2EDuration="1m11.35140031s" podCreationTimestamp="2026-03-21 04:27:10 +0000 UTC" firstStartedPulling="2026-03-21 04:27:15.71332056 +0000 UTC m=+240.041107236" lastFinishedPulling="2026-03-21 04:28:20.847242023 +0000 UTC m=+305.175028699" observedRunningTime="2026-03-21 04:28:21.348764789 +0000 UTC m=+305.676551485" watchObservedRunningTime="2026-03-21 04:28:21.35140031 +0000 UTC m=+305.679186986" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.599845 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.599902 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:22 crc kubenswrapper[4839]: I0321 04:28:22.643170 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:22 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:22 crc kubenswrapper[4839]: > Mar 21 04:28:23 crc kubenswrapper[4839]: I0321 04:28:23.265729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"} Mar 21 04:28:24 crc kubenswrapper[4839]: I0321 04:28:24.291634 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-7m29z" podStartSLOduration=3.759663271 podStartE2EDuration="1m15.29160563s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="2026-03-21 04:27:10.992514516 +0000 UTC m=+235.320301192" lastFinishedPulling="2026-03-21 04:28:22.524456875 +0000 UTC m=+306.852243551" observedRunningTime="2026-03-21 04:28:24.290430708 +0000 UTC m=+308.618217384" watchObservedRunningTime="2026-03-21 04:28:24.29160563 +0000 UTC m=+308.619392306" Mar 21 04:28:25 crc kubenswrapper[4839]: I0321 04:28:25.277134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} Mar 21 04:28:25 crc kubenswrapper[4839]: I0321 04:28:25.296838 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgfcm" podStartSLOduration=4.948884055 podStartE2EDuration="1m15.296822261s" podCreationTimestamp="2026-03-21 04:27:10 +0000 UTC" firstStartedPulling="2026-03-21 04:27:13.68974055 +0000 UTC m=+238.017527236" lastFinishedPulling="2026-03-21 04:28:24.037678766 +0000 UTC m=+308.365465442" observedRunningTime="2026-03-21 04:28:25.296082301 +0000 UTC m=+309.623868977" watchObservedRunningTime="2026-03-21 04:28:25.296822261 +0000 UTC m=+309.624608937" Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.705328 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.705551 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" 
containerID="cri-o://daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" gracePeriod=30 Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.807283 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.807886 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" containerID="cri-o://5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" gracePeriod=30 Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.577475 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.577547 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.617894 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.000767 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.000931 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.037501 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.124794 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.124840 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.201434 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.306556 4839 generic.go:334] "Generic (PLEG): container finished" podID="c4842010-e137-466d-9596-e65f0cf2f4da" containerID="5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" exitCode=0 Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.306770 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerDied","Data":"5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642"} Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.315345 4839 generic.go:334] "Generic (PLEG): container finished" podID="602cd797-e549-4e26-a152-a1cb4decf82d" containerID="daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" exitCode=0 Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.315744 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerDied","Data":"daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614"} Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.358807 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.364105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.386464 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.888937 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.983716 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.013661 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.013910 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014204 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.014231 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014237 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.014254 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014260 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc 
kubenswrapper[4839]: E0321 04:28:29.014267 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014273 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014366 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014376 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014387 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014632 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.015163 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.023237 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.076825 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119068 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119418 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119609 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119911 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120039 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120175 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120278 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120415 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120634 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config" (OuterVolumeSpecName: "config") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120792 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120919 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120788 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca" (OuterVolumeSpecName: "client-ca") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120977 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121002 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca" (OuterVolumeSpecName: "client-ca") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121291 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121408 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121531 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121496 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config" (OuterVolumeSpecName: "config") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121737 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121852 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121883 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121907 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.126070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc" (OuterVolumeSpecName: "kube-api-access-s54qc") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "kube-api-access-s54qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.126169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.128456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w" (OuterVolumeSpecName: "kube-api-access-k9x2w") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "kube-api-access-k9x2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.130982 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.222686 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223062 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223208 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223456 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod 
\"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223624 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223726 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223839 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223920 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223989 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.224603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.224844 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.225404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.229703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.242839 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.329931 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.329940 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerDied","Data":"8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e"} Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.330660 4839 scope.go:117] "RemoveContainer" containerID="5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.334075 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerDied","Data":"37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94"} Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.334742 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.363716 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.366542 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.374744 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.375113 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.375182 4839 scope.go:117] "RemoveContainer" containerID="daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.380081 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.684953 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.685292 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.728423 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.828675 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: W0321 04:28:29.839296 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56aba9c_2ad5_4635_b6ad_eac6f79054c3.slice/crio-d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1 WatchSource:0}: Error finding container d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1: Status 404 returned error can't find the container with id d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1 Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.139103 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 
04:28:30.139180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.174829 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.343393 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" containerID="cri-o://f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691" gracePeriod=2 Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.344038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerStarted","Data":"d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1"} Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.393129 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.394185 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.460735 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" path="/var/lib/kubelet/pods/602cd797-e549-4e26-a152-a1cb4decf82d/volumes" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.461394 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" path="/var/lib/kubelet/pods/c4842010-e137-466d-9596-e65f0cf2f4da/volumes" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.723930 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.723991 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.979891 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.979973 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980070 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980829 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980927 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" 
containerID="cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" gracePeriod=600 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.288293 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.350192 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691" exitCode=0 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.350260 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691"} Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.351778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerStarted","Data":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.352064 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" containerID="cri-o://ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" gracePeriod=2 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.635678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.681702 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 
04:28:31.784639 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:31 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:31 crc kubenswrapper[4839]: > Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.887751 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:28:31 crc kubenswrapper[4839]: E0321 04:28:31.888378 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.888402 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.888521 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.889107 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.890723 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891006 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891129 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891345 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891458 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893524 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod 
\"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893563 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893627 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.896981 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.974762 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994114 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994216 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995157 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995245 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" 
Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995270 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995306 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995381 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities" (OuterVolumeSpecName: "utilities") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.996346 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.996734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.999872 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7" (OuterVolumeSpecName: "kube-api-access-w9sw7") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "kube-api-access-w9sw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.000412 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.018842 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.063059 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096355 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096398 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096413 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.267118 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.361158 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" exitCode=0 Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.361309 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40"} 
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364511 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364526 4839 scope.go:117] "RemoveContainer" containerID="f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.383279 4839 scope.go:117] "RemoveContainer" containerID="29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.388501 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" podStartSLOduration=6.38848198 podStartE2EDuration="6.38848198s" podCreationTimestamp="2026-03-21 04:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:32.384798951 +0000 UTC m=+316.712585637" watchObservedRunningTime="2026-03-21 04:28:32.38848198 +0000 UTC m=+316.716268656" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.415897 4839 scope.go:117] "RemoveContainer" containerID="87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.419816 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.438853 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.460003 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" path="/var/lib/kubelet/pods/dc99f39a-8001-466b-acf1-bd106eb2b81d/volumes" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.735475 4839 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.274643 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.312956 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.313076 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.313134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.314004 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities" (OuterVolumeSpecName: "utilities") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.323504 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2" (OuterVolumeSpecName: "kube-api-access-z7dr2") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "kube-api-access-z7dr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.365798 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378709 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerStarted","Data":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"} Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerStarted","Data":"13aade652ede3403d6b69f04e499915fd3869f0f1415b459504a8ea3e6cac5bf"} Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378855 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.391195 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"} Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.402977 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" exitCode=0 Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403028 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"} Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"45c4f382e92761207baa8e2c4160a24c616ae580e9d48096eb767d7eab157d90"} Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403141 4839 scope.go:117] "RemoveContainer" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403047 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.411337 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" podStartSLOduration=7.411298024 podStartE2EDuration="7.411298024s" podCreationTimestamp="2026-03-21 04:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:33.399286162 +0000 UTC m=+317.727072838" watchObservedRunningTime="2026-03-21 04:28:33.411298024 +0000 UTC m=+317.739084700" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414245 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414300 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414314 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.422750 4839 scope.go:117] "RemoveContainer" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.440361 4839 scope.go:117] "RemoveContainer" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.457864 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.462652 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.487838 4839 scope.go:117] "RemoveContainer" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.491769 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": container with ID starting with ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8 not found: ID does not exist" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.491832 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"} err="failed to get container status \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": rpc error: code = NotFound desc = could not find container \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": container with ID starting with ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8 not found: ID does not exist" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.491865 4839 scope.go:117] "RemoveContainer" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8" Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.495789 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": container with ID starting with 
5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8 not found: ID does not exist" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.495836 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"} err="failed to get container status \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": rpc error: code = NotFound desc = could not find container \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": container with ID starting with 5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8 not found: ID does not exist" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.495863 4839 scope.go:117] "RemoveContainer" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250" Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.496370 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": container with ID starting with 5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250 not found: ID does not exist" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.496396 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"} err="failed to get container status \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": rpc error: code = NotFound desc = could not find container \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": container with ID starting with 5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250 not found: ID does not 
exist" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.668444 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.695375 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.695793 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" containerID="cri-o://1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" gracePeriod=2 Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.029805 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225012 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225064 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: 
\"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.226139 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities" (OuterVolumeSpecName: "utilities") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.230470 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l" (OuterVolumeSpecName: "kube-api-access-57m7l") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "kube-api-access-57m7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.254763 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327192 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327226 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327237 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412033 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" exitCode=0 Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412123 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"} Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"ffa5ea5aab95eb6762df26e9e80adb2d4051bdda8b4098100f8d0af4408a8c40"} Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412225 4839 scope.go:117] "RemoveContainer" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.428667 4839 scope.go:117] "RemoveContainer" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.437253 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.441461 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.445193 4839 scope.go:117] "RemoveContainer" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461035 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" path="/var/lib/kubelet/pods/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50/volumes" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461800 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" path="/var/lib/kubelet/pods/f1ec80e5-557b-4c30-8323-87d6b1447a6d/volumes" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461824 4839 scope.go:117] "RemoveContainer" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462177 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": container with ID starting with 1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4 not found: ID does not exist" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462212 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"} err="failed to get container status \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": rpc error: code = NotFound desc = could not find container \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": container with ID starting with 1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4 not found: ID does not exist" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462238 4839 scope.go:117] "RemoveContainer" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa" Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462631 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": container with ID starting with 91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa not found: ID does not exist" 
containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462663 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} err="failed to get container status \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": rpc error: code = NotFound desc = could not find container \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": container with ID starting with 91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa not found: ID does not exist" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462682 4839 scope.go:117] "RemoveContainer" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5" Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462958 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": container with ID starting with 7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5 not found: ID does not exist" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5" Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462996 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"} err="failed to get container status \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": rpc error: code = NotFound desc = could not find container \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": container with ID starting with 7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5 not found: ID does not exist" Mar 21 04:28:36 crc kubenswrapper[4839]: I0321 04:28:36.090396 4839 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:28:36 crc kubenswrapper[4839]: I0321 04:28:36.090689 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" containerID="cri-o://5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa" gracePeriod=2 Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.440538 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa" exitCode=0 Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.440617 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa"} Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.672147 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821468 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821607 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.822395 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities" (OuterVolumeSpecName: "utilities") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.827265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r" (OuterVolumeSpecName: "kube-api-access-d994r") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "kube-api-access-d994r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.923269 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.923314 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.952747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.025070 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.449600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165"} Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.449642 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.450667 4839 scope.go:117] "RemoveContainer" containerID="5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa" Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.469136 4839 scope.go:117] "RemoveContainer" containerID="c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357" Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.500263 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.503155 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.515651 4839 scope.go:117] "RemoveContainer" containerID="d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3" Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.375733 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.381180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.818316 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:28:40 crc kubenswrapper[4839]: I0321 04:28:40.460234 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" path="/var/lib/kubelet/pods/b144748c-2940-4efe-a486-d2b5c1239b12/volumes" Mar 21 04:28:40 crc kubenswrapper[4839]: I0321 04:28:40.759798 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:40 
crc kubenswrapper[4839]: I0321 04:28:40.793666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.712615 4839 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713295 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" gracePeriod=15 Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713487 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" gracePeriod=15 Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713602 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" gracePeriod=15 Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" gracePeriod=15 Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713733 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" gracePeriod=15 Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714204 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714408 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714425 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714434 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714441 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714450 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714456 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714463 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714468 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714477 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714483 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714489 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714496 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714503 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714509 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714519 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714524 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714535 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714541 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714548 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714553 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714561 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714586 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714598 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-content" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714608 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714614 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714629 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714635 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714647 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714674 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714687 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714695 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714704 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714709 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714719 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714725 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714734 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714740 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714750 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714756 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714839 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714846 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714855 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714862 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714870 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714878 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714890 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc 
kubenswrapper[4839]: I0321 04:28:41.714902 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714912 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714919 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714929 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.715049 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715057 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.715064 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715070 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715153 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.719222 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.722450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.732132 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.760681 4839 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877581 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877678 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877730 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877967 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.878009 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 
crc kubenswrapper[4839]: I0321 04:28:41.878036 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978770 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978795 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978799 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978879 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978925 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979116 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979149 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979190 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979197 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.062040 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: W0321 04:28:42.078320 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080 WatchSource:0}: Error finding container 94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080: Status 404 returned error can't find the container with id 94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080 Mar 21 04:28:42 crc kubenswrapper[4839]: E0321 04:28:42.081712 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec0d47c41cf10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,LastTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.471430 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6"} Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.471965 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080"} Mar 21 04:28:42 crc kubenswrapper[4839]: E0321 04:28:42.473414 4839 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.474450 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.476796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.480925 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481015 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481026 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481036 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" exitCode=2 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481097 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.487543 4839 generic.go:334] "Generic (PLEG): container finished" podID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerID="c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.487615 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerDied","Data":"c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f"} Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.488453 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:43 crc kubenswrapper[4839]: I0321 04:28:43.502240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.108328 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.109332 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.113690 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.114460 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.114927 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.115466 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207224 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207268 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207285 4839 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207309 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207348 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207372 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock" (OuterVolumeSpecName: "var-lock") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207399 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207476 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207423 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207593 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207893 4839 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207905 4839 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207915 4839 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207924 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207931 4839 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.213898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.308993 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.460856 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.511454 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512156 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" exitCode=0 Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512219 4839 scope.go:117] "RemoveContainer" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512350 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.513649 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514957 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerDied","Data":"c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502"} Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514976 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514992 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.516094 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.516399 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 
04:28:44.516638 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.522808 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.523040 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.529475 4839 scope.go:117] "RemoveContainer" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.542983 4839 scope.go:117] "RemoveContainer" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.558292 4839 scope.go:117] "RemoveContainer" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.575701 4839 scope.go:117] "RemoveContainer" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.593507 4839 scope.go:117] "RemoveContainer" 
containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.613245 4839 scope.go:117] "RemoveContainer" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.613891 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": container with ID starting with 412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87 not found: ID does not exist" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.613966 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87"} err="failed to get container status \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": rpc error: code = NotFound desc = could not find container \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": container with ID starting with 412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.614006 4839 scope.go:117] "RemoveContainer" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.616127 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": container with ID starting with 573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898 not found: ID does not exist" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc 
kubenswrapper[4839]: I0321 04:28:44.616184 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898"} err="failed to get container status \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": rpc error: code = NotFound desc = could not find container \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": container with ID starting with 573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616215 4839 scope.go:117] "RemoveContainer" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.616681 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": container with ID starting with 6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a not found: ID does not exist" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616730 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a"} err="failed to get container status \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": rpc error: code = NotFound desc = could not find container \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": container with ID starting with 6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616762 4839 scope.go:117] "RemoveContainer" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 
04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.617415 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": container with ID starting with e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07 not found: ID does not exist" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617457 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07"} err="failed to get container status \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": rpc error: code = NotFound desc = could not find container \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": container with ID starting with e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617484 4839 scope.go:117] "RemoveContainer" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.617836 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": container with ID starting with 7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529 not found: ID does not exist" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617868 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529"} err="failed to get container status 
\"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": rpc error: code = NotFound desc = could not find container \"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": container with ID starting with 7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617890 4839 scope.go:117] "RemoveContainer" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.618201 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": container with ID starting with f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649 not found: ID does not exist" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.618242 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649"} err="failed to get container status \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": rpc error: code = NotFound desc = could not find container \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": container with ID starting with f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649 not found: ID does not exist" Mar 21 04:28:46 crc kubenswrapper[4839]: I0321 04:28:46.455277 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:46 crc kubenswrapper[4839]: I0321 
04:28:46.456019 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.475018 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.475683 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476019 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476344 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476729 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: I0321 04:28:48.476765 4839 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to 
update lease" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.477049 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.678330 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 21 04:28:49 crc kubenswrapper[4839]: E0321 04:28:49.079881 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 21 04:28:49 crc kubenswrapper[4839]: E0321 04:28:49.881363 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.443644 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.444058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.445085 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.445153 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.445085 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.445183 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242\": dial tcp 
38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.482793 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.545128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.545199 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.545707 4839 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.545808 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.793279 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec0d47c41cf10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,LastTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444255 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444300 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444358 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:54.444334464 +0000 UTC m=+458.772121140 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444374 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:54.444368295 +0000 UTC m=+458.772154971 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546365 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546374 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: W0321 04:28:52.546826 4839 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546881 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546624 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546656 4839 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546673 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546694 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546750 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:55.546731579 +0000 UTC m=+459.874518245 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546784 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:55.54675931 +0000 UTC m=+459.874545986 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: W0321 04:28:53.748180 4839 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.748257 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:53 crc kubenswrapper[4839]: W0321 04:28:53.757331 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.757408 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:54 crc kubenswrapper[4839]: W0321 04:28:54.260003 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.260125 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.452000 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.452984 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.470030 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.470405 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.470865 4839 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.471367 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: W0321 04:28:54.493773 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8 WatchSource:0}: Error finding container d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8: Status 404 returned error can't find the container with id d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8 Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.570099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8"} Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.684452 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.468352 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.575918 4839 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2" exitCode=0 Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.575973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2"} Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576219 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576606 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576722 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.577013 4839 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:55 crc kubenswrapper[4839]: W0321 04:28:55.680975 4839 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.681031 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:56 crc kubenswrapper[4839]: E0321 04:28:56.471403 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:28:56 crc kubenswrapper[4839]: E0321 04:28:56.499423 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd0c768fd3fc7669df9ff99977152e0a55f1eff773769ca371580e0fd59e5587"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bde8d34b587c2d23e05a83f6a08f5c7a5abc03ac5af7b64f36e36b748e3d494d"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d38cb0ea447e009dbf47aa1847902929beefa1516c957583d83968e314bfe607"} Mar 21 04:28:56 crc kubenswrapper[4839]: 
I0321 04:28:56.587219 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1bb6a44266fbf9a1f46069e2559c9f6c5efaa916f94a462779947140b2624181"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.591702 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593111 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593162 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c" exitCode=1 Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593890 4839 scope.go:117] "RemoveContainer" containerID="dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.003025 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602257 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b3c0e32beebd917f01eaea8197f62ad832a1a8697b7f81f9f5ddaabf1b01d25f"} Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602505 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602519 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602893 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.604949 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.605895 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.605935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ea3e2d51f5d6c22330bdcd78365352b43f67ec780cfe8d52f618f414340d6f6"} Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.069456 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.472839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: 
I0321 04:28:59.473093 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.479855 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.821192 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.824976 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.101642 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.122082 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.621978 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.764869 4839 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.800100 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:55Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod 
\"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"c5a473f2-0430-4c9b-8ef8-60d457db5188\": field is immutable" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.819679 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.413741 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.637631 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.637670 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.640871 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.642238 4839 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://1bb6a44266fbf9a1f46069e2559c9f6c5efaa916f94a462779947140b2624181" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.642274 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.642561 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 
04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.642610 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.646027 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.846251 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" containerID="cri-o://5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" gracePeriod=15 Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.649401 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerDied","Data":"5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d"} Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.649530 4839 generic.go:334] "Generic (PLEG): container finished" podID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerID="5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" exitCode=0 Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.850738 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928427 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928500 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: 
\"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.929561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930012 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930080 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod 
\"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930195 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930223 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 
21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930552 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930649 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930669 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.931352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934948 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq" (OuterVolumeSpecName: "kube-api-access-z5ktq") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "kube-api-access-z5ktq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935423 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935591 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935719 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.936048 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031189 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031220 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031229 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031238 4839 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031248 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031257 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031266 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031273 4839 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031282 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031290 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 
04:29:06.031300 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031308 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655881 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerDied","Data":"acab2b98e2e3828439a02d33c7b3fd1855365edb0946861b3e5dc01800f9adfe"} Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655942 4839 scope.go:117] "RemoveContainer" containerID="5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:29:07 crc kubenswrapper[4839]: I0321 04:29:07.452421 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:29:07 crc kubenswrapper[4839]: I0321 04:29:07.452805 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:29:09 crc kubenswrapper[4839]: I0321 04:29:09.452611 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:29:11 crc kubenswrapper[4839]: I0321 04:29:11.887146 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.129610 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.186744 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.390461 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.392392 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.590949 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.603481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.743044 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.828061 4839 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.097258 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" 
Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.125237 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.388841 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.069079 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.147277 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.150674 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.435583 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.532351 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.736307 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.894421 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.976666 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.997211 4839 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.077916 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.182791 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.182871 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.197353 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441435 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441645 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441793 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442061 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442245 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442492 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442903 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442967 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.444250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450313 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450528 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450714 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.453038 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.456763 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.456952 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.460510 4839 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.466929 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:29:17 crc 
kubenswrapper[4839]: I0321 04:29:17.467240 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:29:17 crc kubenswrapper[4839]: E0321 04:29:17.467457 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467478 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: E0321 04:29:17.467502 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467510 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467646 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467666 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.468169 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.469065 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.469089 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477131 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477426 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477527 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477628 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478047 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478195 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478327 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478666 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478741 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.484980 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.485241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.489465 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.495155 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.503634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526495 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: 
\"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526661 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26phm\" (UniqueName: \"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526858 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526902 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527032 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527124 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527270 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527373 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527455 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527548 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " 
pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.542604 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.542583949 podStartE2EDuration="15.542583949s" podCreationTimestamp="2026-03-21 04:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:17.52339265 +0000 UTC m=+361.851179346" watchObservedRunningTime="2026-03-21 04:29:17.542583949 +0000 UTC m=+361.870370625" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.573589 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.604117 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.620996 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631251 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631651 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26phm\" (UniqueName: 
\"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631732 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631758 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631789 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631809 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 
04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631831 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631848 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631878 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631896 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631914 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631962 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.632919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.632975 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.633170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.634945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.635413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.639664 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.639921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.642722 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.642894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.645538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.649221 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.649772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.650143 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.654876 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26phm\" (UniqueName: \"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.715389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.723683 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.750580 4839 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.769234 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.798596 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.839487 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.898670 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.947386 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.953477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.028833 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.030456 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.045065 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.070099 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.117542 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.241292 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.277214 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.407092 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.458169 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" path="/var/lib/kubelet/pods/6bfbd19d-a44a-459c-bd6e-150241ce3ebb/volumes"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.551011 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.556559 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.568262 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.683257 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.694040 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.739914 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.745899 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.777797 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.818600 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.854780 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.931638 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.968250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.007483 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.207417 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.212951 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.257220 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.380478 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.404504 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.516341 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.569083 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.578797 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.680200 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.693279 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.745222 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.760209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.766616 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.786877 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.005457 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.076030 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.109331 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.131337 4839 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.166894 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.179697 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.305408 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.333748 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.477229 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.500144 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.528653 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.642359 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.674358 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.676358 4839 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.803746 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.830988 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.874540 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.036731 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.074938 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.097964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.101380 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.178983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.186192 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.271338 4839 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.315598 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.405427 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.447955 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.492691 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.508447 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.548453 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.615714 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.629733 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.723488 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.769209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.831962 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.893925 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.951346 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.000543 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.050092 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.051856 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.066768 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.081536 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.091143 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.135728 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6"]
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.138500 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.188701 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.299153 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.313098 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.331030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.331267 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.366424 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.398087 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.399219 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.421855 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.454072 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.553317 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.559102 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.575278 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.621304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6"]
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.656678 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.680849 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.811226 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.884466 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.958264 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.971348 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.012250 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.214388 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.354322 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.413898 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.423990 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.441229 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495374 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/0.log"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495416 4839 generic.go:334] "Generic (PLEG): container finished" podID="02d1828c-4b4b-4d6e-994e-d1b383763960" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d" exitCode=255
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerDied","Data":"202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d"}
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerStarted","Data":"ef601578549e3605d9337a01c412bc01a0063ded26fd0d3f4c5256f1ff349d54"}
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.496060 4839 scope.go:117] "RemoveContainer" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.532235 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.545604 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.607524 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.642142 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.643561 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.677355 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.713311 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.726930 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.772903 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.872274 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.002451 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.030522 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.036597 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.056511 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.173286 4839 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.173921 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" gracePeriod=5
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.181795 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.257717 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.260988 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.340654 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.505555 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506040 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/0.log"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506092 4839 generic.go:334] "Generic (PLEG): container finished" podID="02d1828c-4b4b-4d6e-994e-d1b383763960" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" exitCode=255
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerDied","Data":"6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0"}
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506166 4839 scope.go:117] "RemoveContainer" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506602 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0"
Mar 21 04:29:24 crc kubenswrapper[4839]: E0321 04:29:24.506890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.636600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.654584 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.917892 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.939692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.977737 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.114261 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.121268 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.132550 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.142039 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.172174 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.235552 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.238879 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.259076 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.327710 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.328896 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.360021 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.376605 4839 reflector.go:368] Caches populated for
*v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.514016 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.514616 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:25 crc kubenswrapper[4839]: E0321 04:29:25.514826 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.537303 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.574458 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.588898 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.775744 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.796199 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.955156 4839 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.065924 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.116674 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.174399 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.228081 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.350121 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.389874 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.404074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.553367 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.621396 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.639389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.650531 
4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.662610 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.701261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.701512 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" containerID="cri-o://42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" gracePeriod=30 Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.705318 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.705517 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" containerID="cri-o://ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" gracePeriod=30 Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.735443 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.754453 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.769407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 
21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.785634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.925241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.161342 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.167075 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359380 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359457 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359519 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359602 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359704 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359764 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 
04:29:27.360126 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.360526 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.360534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config" (OuterVolumeSpecName: "config") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.361144 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.362053 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config" (OuterVolumeSpecName: "config") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365434 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365614 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm" (OuterVolumeSpecName: "kube-api-access-dngsm") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "kube-api-access-dngsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.366220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5" (OuterVolumeSpecName: "kube-api-access-lssj5") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "kube-api-access-lssj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.374767 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.430840 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.453331 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461038 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461076 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461087 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461096 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dngsm\" 
(UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461109 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461118 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461127 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461137 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461146 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.499490 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.525852 4839 generic.go:334] "Generic (PLEG): container finished" podID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" exitCode=0 Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526218 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerDied","Data":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526417 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerDied","Data":"d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526303 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526461 4839 scope.go:117] "RemoveContainer" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528225 4839 generic.go:334] "Generic (PLEG): container finished" podID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" exitCode=0 Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerDied","Data":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerDied","Data":"13aade652ede3403d6b69f04e499915fd3869f0f1415b459504a8ea3e6cac5bf"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528356 4839 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.552644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.554454 4839 scope.go:117] "RemoveContainer" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.555393 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": container with ID starting with 42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf not found: ID does not exist" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.555439 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} err="failed to get container status \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": rpc error: code = NotFound desc = could not find container \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": container with ID starting with 42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf not found: ID does not exist" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.555465 4839 scope.go:117] "RemoveContainer" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.560662 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:27 crc 
kubenswrapper[4839]: I0321 04:29:27.564729 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.571062 4839 scope.go:117] "RemoveContainer" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.571524 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": container with ID starting with ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c not found: ID does not exist" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.571658 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"} err="failed to get container status \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": rpc error: code = NotFound desc = could not find container \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": container with ID starting with ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c not found: ID does not exist" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.573400 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.577092 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.694287 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 
04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.696381 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.764752 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.799551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.799621 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.800132 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.800297 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.994909 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.007368 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.075151 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 21 04:29:28 crc 
kubenswrapper[4839]: I0321 04:29:28.282929 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.458736 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" path="/var/lib/kubelet/pods/2e226a30-c23d-4a45-ab06-4087bf0a38c7/volumes" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.459277 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" path="/var/lib/kubelet/pods/c56aba9c-2ad5-4635-b6ad-eac6f79054c3/volumes" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485591 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485792 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485804 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485813 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485818 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485831 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485837 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485920 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485930 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.486232 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.490948 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491024 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491072 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491083 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491442 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.492936 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.499211 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.500203 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.508983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.509245 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.509926 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.510961 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.511481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.518238 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.519896 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.520168 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.526546 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.628838 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674791 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674895 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674992 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.675117 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.675187 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775677 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775839 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2h9k\" (UniqueName: 
\"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775866 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775924 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775942 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775959 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 
04:29:28.776165 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777459 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777809 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.779814 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.787912 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.790079 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.792386 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.793388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.798334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.820417 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.827467 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.019741 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:29 crc kubenswrapper[4839]: W0321 04:29:29.061541 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bd8eb3_11ef_4a6e_a579_e5bddf00634f.slice/crio-0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9 WatchSource:0}: Error finding container 0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9: Status 404 returned error can't find the container with id 0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9 Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.064517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.108680 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.240695 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:29:29 crc 
kubenswrapper[4839]: I0321 04:29:29.349560 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.546582 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.546642 4839 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" exitCode=137 Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555536 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerStarted","Data":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555619 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerStarted","Data":"0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.557791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerStarted","Data":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.557839 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerStarted","Data":"8d1f0128d103bab12910baaf83b8d5ee7c491413cb0a80553f2697a11324deab"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.558210 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.561498 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.562971 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.574717 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" podStartSLOduration=3.574699172 podStartE2EDuration="3.574699172s" podCreationTimestamp="2026-03-21 04:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:29.572993383 +0000 UTC m=+373.900780069" watchObservedRunningTime="2026-03-21 04:29:29.574699172 +0000 UTC m=+373.902485848" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.724388 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.724464 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791692 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791749 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791814 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791811 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791857 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791890 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791903 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791920 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792012 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792177 4839 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792189 4839 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792201 4839 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792210 4839 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.799863 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.820533 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.892598 4839 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.945141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.127233 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.468209 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566210 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566354 4839 scope.go:117] "RemoveContainer" containerID="7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566498 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:29:41 crc kubenswrapper[4839]: I0321 04:29:41.452558 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.626705 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.627027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerStarted","Data":"128cf1880264029b1df19f2d836e53b76d4f39d0c9758f4478220d81c2192166"} Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.627351 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.632378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.646662 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podStartSLOduration=63.646643679 podStartE2EDuration="1m3.646643679s" podCreationTimestamp="2026-03-21 04:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:42.644737025 +0000 UTC m=+386.972523711" watchObservedRunningTime="2026-03-21 04:29:42.646643679 +0000 UTC m=+386.974430355" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.650149 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" podStartSLOduration=16.650139199 podStartE2EDuration="16.650139199s" podCreationTimestamp="2026-03-21 04:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:29.616086615 +0000 UTC m=+373.943873301" watchObservedRunningTime="2026-03-21 04:29:42.650139199 +0000 UTC m=+386.977925875" Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.696943 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.697457 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" containerID="cri-o://5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" gracePeriod=30 Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.711189 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.711835 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" containerID="cri-o://a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" gracePeriod=30 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.224505 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259470 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259520 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259554 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.260275 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca" (OuterVolumeSpecName: "client-ca") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.260329 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config" (OuterVolumeSpecName: "config") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.264806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt" (OuterVolumeSpecName: "kube-api-access-k2kdt") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "kube-api-access-k2kdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.266733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.340850 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360699 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360738 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360750 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360762 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461098 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461189 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461326 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca" (OuterVolumeSpecName: "client-ca") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.462023 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config" (OuterVolumeSpecName: "config") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.462037 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.464161 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.464679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k" (OuterVolumeSpecName: "kube-api-access-k2h9k") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "kube-api-access-k2h9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562407 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562441 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562449 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562460 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562468 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.652971 4839 generic.go:334] "Generic (PLEG): container finished" podID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" exitCode=0 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerDied","Data":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} Mar 21 04:29:47 crc 
kubenswrapper[4839]: I0321 04:29:47.653066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerDied","Data":"0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653084 4839 scope.go:117] "RemoveContainer" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653173 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655378 4839 generic.go:334] "Generic (PLEG): container finished" podID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" exitCode=0 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655412 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerDied","Data":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655459 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerDied","Data":"8d1f0128d103bab12910baaf83b8d5ee7c491413cb0a80553f2697a11324deab"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.666927 4839 scope.go:117] "RemoveContainer" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: E0321 04:29:47.667265 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": container with ID starting with 5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7 not found: ID does not exist" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.667298 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} err="failed to get container status \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": rpc error: code = NotFound desc = could not find container \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": container with ID starting with 
5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7 not found: ID does not exist" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.667321 4839 scope.go:117] "RemoveContainer" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.680485 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.681652 4839 scope.go:117] "RemoveContainer" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: E0321 04:29:47.681982 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": container with ID starting with a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4 not found: ID does not exist" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.682008 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} err="failed to get container status \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": rpc error: code = NotFound desc = could not find container \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": container with ID starting with a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4 not found: ID does not exist" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.683710 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.690447 4839 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.693291 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.459206 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" path="/var/lib/kubelet/pods/67bd8eb3-11ef-4a6e-a579-e5bddf00634f/volumes" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.459907 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" path="/var/lib/kubelet/pods/ecb0a5f1-f808-400a-a4c1-205733971f86/volumes" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.503827 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:48 crc kubenswrapper[4839]: E0321 04:29:48.504159 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504178 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: E0321 04:29:48.504209 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504368 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" 
containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504389 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.505022 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507189 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507591 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507776 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507820 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507925 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507977 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.508049 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.513932 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514113 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514234 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514318 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514327 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.515219 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.517012 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.525305 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.525517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.572972 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573030 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573132 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: 
\"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573514 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573644 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675251 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675364 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675426 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675459 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675509 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 
04:29:48.675546 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675766 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.676828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: 
\"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.676881 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.677483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.677526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.678515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.680478 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.684998 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.694170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.696463 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.827037 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.838223 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.068232 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:49 crc kubenswrapper[4839]: W0321 04:29:49.073226 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec6a4a0_0142_4e96_8bb1_4bd5592708fb.slice/crio-6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e WatchSource:0}: Error finding container 6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e: Status 404 returned error can't find the container with id 6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.253691 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.669930 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerStarted","Data":"30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.669980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerStarted","Data":"d7e1d1749d2f8c80b6cb4501bf2388e3ae20d7bb2af4e3b44e88a047b92a941b"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.671175 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673477 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerStarted","Data":"cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673509 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerStarted","Data":"6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673721 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.677770 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.679214 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.692325 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" podStartSLOduration=3.692307288 podStartE2EDuration="3.692307288s" podCreationTimestamp="2026-03-21 04:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:49.690530169 +0000 UTC m=+394.018316855" watchObservedRunningTime="2026-03-21 04:29:49.692307288 +0000 UTC m=+394.020093964" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.727800 4839 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" podStartSLOduration=3.7277851699999998 podStartE2EDuration="3.72778517s" podCreationTimestamp="2026-03-21 04:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:49.725167448 +0000 UTC m=+394.052954124" watchObservedRunningTime="2026-03-21 04:29:49.72778517 +0000 UTC m=+394.055571846" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.157785 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.158956 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.160227 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.160903 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.161116 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.161289 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.162355 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.163275 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.163406 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.165080 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.183548 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294395 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294546 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395633 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395752 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 
04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395796 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395830 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.398231 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.407358 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.411835 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: 
\"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.412261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.480819 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.488903 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.921654 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.930315 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: W0321 04:30:00.931157 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd65835_5c51_49a6_8e2f_9ac9569c2c64.slice/crio-72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0 WatchSource:0}: Error finding container 72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0: Status 404 returned error can't find the container with id 72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0 Mar 21 04:30:00 crc kubenswrapper[4839]: W0321 04:30:00.935584 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64e6efc9_03ce_4af4_bcc2_bc64ceebc652.slice/crio-5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0 WatchSource:0}: Error finding container 5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0: Status 404 returned error can't find the container with id 5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0 Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.766711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerStarted","Data":"5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0"} Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769637 4839 generic.go:334] "Generic (PLEG): container finished" podID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerID="d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f" exitCode=0 Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerDied","Data":"d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f"} Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769710 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerStarted","Data":"72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.204164 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231219 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231877 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231916 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.233109 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume" (OuterVolumeSpecName: "config-volume") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.237553 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.238703 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q" (OuterVolumeSpecName: "kube-api-access-8792q") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "kube-api-access-8792q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333046 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333078 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333088 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.781653 4839 generic.go:334] "Generic (PLEG): container finished" podID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerID="7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b" exitCode=0 Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.781711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerDied","Data":"7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783505 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerDied","Data":"72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783528 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783586 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.153433 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.154902 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.159819 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn" (OuterVolumeSpecName: "kube-api-access-vk4tn") pod "64e6efc9-03ce-4af4-bcc2-bc64ceebc652" (UID: "64e6efc9-03ce-4af4-bcc2-bc64ceebc652"). InnerVolumeSpecName "kube-api-access-vk4tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.255691 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794625 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerDied","Data":"5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0"} Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794656 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794662 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0" Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.678826 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.679630 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" containerID="cri-o://30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" gracePeriod=30 Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.903060 4839 generic.go:334] "Generic (PLEG): container finished" podID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerID="30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" exitCode=0 Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.903099 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerDied","Data":"30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707"} Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.160930 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.187930 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188161 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189303 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca" (OuterVolumeSpecName: "client-ca") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189320 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config" (OuterVolumeSpecName: "config") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.194065 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559" (OuterVolumeSpecName: "kube-api-access-bz559") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "kube-api-access-bz559". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.194202 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290157 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290211 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290239 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290250 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290260 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760133 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] 
Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760714 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760730 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760740 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760748 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760906 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760916 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761053 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761065 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761083 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.772424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795197 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795407 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795449 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896849 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896893 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896955 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896970 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.898030 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.898101 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.900130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.906233 4839 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.911948 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerDied","Data":"d7e1d1749d2f8c80b6cb4501bf2388e3ae20d7bb2af4e3b44e88a047b92a941b"} Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.912254 4839 scope.go:117] "RemoveContainer" containerID="30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.912657 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.919299 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.977148 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.980403 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.091790 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.459065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" path="/var/lib/kubelet/pods/690da0e8-bbb6-43cd-a875-01057cb5c75c/volumes" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.487940 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" event={"ID":"7b710421-2c24-4791-a561-846b4830b732","Type":"ContainerStarted","Data":"7a77379278e6f9bb3b097e82d6b7770ec9bdb02d0153f713b8f21179945aa0a3"} Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" event={"ID":"7b710421-2c24-4791-a561-846b4830b732","Type":"ContainerStarted","Data":"1892b5ea14009b912efbf9b734e6dcf7bd4394bc469fa05c6907c7a1e8c51927"} Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920797 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.925145 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.935784 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" podStartSLOduration=2.935765214 podStartE2EDuration="2.935765214s" podCreationTimestamp="2026-03-21 04:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:28.934247052 +0000 UTC m=+433.262033768" watchObservedRunningTime="2026-03-21 04:30:28.935765214 +0000 UTC m=+433.263551890" Mar 21 04:30:46 crc kubenswrapper[4839]: I0321 04:30:46.677173 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:46 crc kubenswrapper[4839]: I0321 04:30:46.679410 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" containerID="cri-o://cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" gracePeriod=30 Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.015424 4839 generic.go:334] "Generic (PLEG): container finished" podID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerID="cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" exitCode=0 Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.015666 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerDied","Data":"cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc"} Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.135025 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155163 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155261 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155295 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.156133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.156436 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config" (OuterVolumeSpecName: "config") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.161352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk" (OuterVolumeSpecName: "kube-api-access-hxlzk") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "kube-api-access-hxlzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.161698 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256463 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256497 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256506 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256514 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.769926 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:47 crc kubenswrapper[4839]: E0321 04:30:47.770158 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770173 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770294 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770707 4839 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.781839 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881592 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881662 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881696 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod 
\"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982444 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982529 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982627 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982670 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.983541 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.984860 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.989695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.004218 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.025398 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerDied","Data":"6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e"} Mar 21 
04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.025847 4839 scope.go:117] "RemoveContainer" containerID="cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.028762 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.066888 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.069820 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.090009 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.460165 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" path="/var/lib/kubelet/pods/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb/volumes" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.545440 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:48 crc kubenswrapper[4839]: W0321 04:30:48.551385 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01f7098_c6cd_4537_b145_c7090c45f92c.slice/crio-4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0 WatchSource:0}: Error finding container 4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0: Status 404 returned error can't find the container with id 
4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0 Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033247 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" event={"ID":"e01f7098-c6cd-4537-b145-c7090c45f92c","Type":"ContainerStarted","Data":"e84ec08bf925ff9dc30f0b6708dd111cd670a713a1eab3bcc8ca98041173de0c"} Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" event={"ID":"e01f7098-c6cd-4537-b145-c7090c45f92c","Type":"ContainerStarted","Data":"4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0"} Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033506 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.039853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.057808 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" podStartSLOduration=3.05778756 podStartE2EDuration="3.05778756s" podCreationTimestamp="2026-03-21 04:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:49.053493093 +0000 UTC m=+453.381279769" watchObservedRunningTime="2026-03-21 04:30:49.05778756 +0000 UTC m=+453.385574236" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.678754 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 
04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.679710 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.691097 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.848349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.848836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849137 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" 
(UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849264 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849480 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849594 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.884648 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952519 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952707 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 
04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952803 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952841 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.953772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.955527 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: 
\"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.955864 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.960319 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.960362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.971945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.978215 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod 
\"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:51 crc kubenswrapper[4839]: I0321 04:30:51.007232 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:51 crc kubenswrapper[4839]: I0321 04:30:51.467637 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 04:30:51 crc kubenswrapper[4839]: W0321 04:30:51.478745 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38bcb76_a7ac_4c9c_8113_82113a818347.slice/crio-dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60 WatchSource:0}: Error finding container dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60: Status 404 returned error can't find the container with id dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60 Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" event={"ID":"f38bcb76-a7ac-4c9c-8113-82113a818347","Type":"ContainerStarted","Data":"85f0b2b796ead4669125a67d51a2d115844b73c64d19e425a155a86e90f916b2"} Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" event={"ID":"f38bcb76-a7ac-4c9c-8113-82113a818347","Type":"ContainerStarted","Data":"dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60"} Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053388 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.081949 
4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" podStartSLOduration=2.081919896 podStartE2EDuration="2.081919896s" podCreationTimestamp="2026-03-21 04:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:52.07584699 +0000 UTC m=+456.403633696" watchObservedRunningTime="2026-03-21 04:30:52.081919896 +0000 UTC m=+456.409706592" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.502077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.502552 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.503818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.509386 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.554364 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.071318 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"745b558b217e5153570af89d9c999c0979ecc2ce0441d5449c2dd805d7fc01ec"} Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.619934 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.620412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.625928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.627684 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.654036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.755857 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:56 crc kubenswrapper[4839]: I0321 04:30:56.112207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f70fef8b391bd5f5c87fd8bd890dbf0a6c418cf851fa6fc6c48976086e26181"} Mar 21 04:30:56 crc kubenswrapper[4839]: W0321 04:30:56.171846 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223 WatchSource:0}: Error finding container 4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223: Status 404 returned error can't find the container with id 4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223 Mar 21 04:30:56 crc kubenswrapper[4839]: W0321 04:30:56.235627 4839 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa WatchSource:0}: Error finding container 29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa: Status 404 returned error can't find the container with id 29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.119037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"02c21ed828f04bc82be2a8edc51d67c38b0d2bd2d9096daec5d2fd9d06d41f37"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.119411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.123798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"02a2a03d40793c60881217f7e9dae950ae3b7179c07747924d67f4ba362f51ce"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.124146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.124305 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:31:00 crc 
kubenswrapper[4839]: I0321 04:31:00.980270 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:31:00 crc kubenswrapper[4839]: I0321 04:31:00.980527 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.258491 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.258778 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server" containerID="cri-o://3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.269856 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.270117 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" containerID="cri-o://f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.276767 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.276965 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" containerID="cri-o://10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.288216 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.288443 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server" containerID="cri-o://afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.296250 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.297034 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.309300 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.309613 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" containerID="cri-o://e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.325736 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490518 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490979 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-628d5\" (UniqueName: 
\"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-628d5\" (UniqueName: \"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591942 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591995 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.593496 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc 
kubenswrapper[4839]: I0321 04:31:01.597895 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.611117 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-628d5\" (UniqueName: \"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.617613 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.801353 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.954612 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.959142 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.965508 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.970909 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998478 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998583 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.002630 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities" (OuterVolumeSpecName: "utilities") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.003734 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4" (OuterVolumeSpecName: "kube-api-access-krww4") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "kube-api-access-krww4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.088753 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099461 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099484 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099515 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099597 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099643 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099696 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099717 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099754 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100040 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100067 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100081 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100761 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities" (OuterVolumeSpecName: "utilities") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: 
"0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.102623 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities" (OuterVolumeSpecName: "utilities") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.103295 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.103821 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities" (OuterVolumeSpecName: "utilities") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105225 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7" (OuterVolumeSpecName: "kube-api-access-nt4t7") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "kube-api-access-nt4t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105507 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc" (OuterVolumeSpecName: "kube-api-access-ncnmc") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: "0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "kube-api-access-ncnmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105878 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4" (OuterVolumeSpecName: "kube-api-access-jctj4") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "kube-api-access-jctj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.106137 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.106904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn" (OuterVolumeSpecName: "kube-api-access-v69rn") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "kube-api-access-v69rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.130552 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: "0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.154970 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155041 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155053 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155067 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"3c535ea31a5aa838095ae16f33b0780c50c3c3698c73e47ae8e3f30c17a3ac39"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155085 4839 scope.go:117] "RemoveContainer" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157646 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157699 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157722 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"b5f435157e1b2e816a83545c0d59dbf17d3143a0eb363bf4e4b546731c0c8b35"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157773 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170362 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170427 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170469 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.171457 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173659 4839 generic.go:334] "Generic (PLEG): container finished" podID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerDied","Data":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 
04:31:02.173720 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerDied","Data":"dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173773 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.180143 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.183027 4839 scope.go:117] "RemoveContainer" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205766 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205838 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205857 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc 
kubenswrapper[4839]: I0321 04:31:02.205871 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205890 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205929 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205942 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205960 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205972 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.206017 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.206035 4839 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211114 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211187 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211220 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211528 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.221188 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.225411 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.229398 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.233277 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.239024 4839 scope.go:117] "RemoveContainer" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.245414 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.251709 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.259113 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.276294 4839 scope.go:117] "RemoveContainer" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.276971 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": container with ID starting with f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4 not found: ID does not exist" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277008 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} err="failed to get container status \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": rpc error: code = NotFound desc = could not find container \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": container with ID starting with f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277048 4839 scope.go:117] "RemoveContainer" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.277270 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": container with ID starting with 8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730 not found: ID does not exist" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277312 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730"} err="failed to get container status \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": rpc error: code = NotFound desc = could not find container \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": container with ID starting with 8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277327 4839 scope.go:117] "RemoveContainer" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.277536 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": container with ID starting with ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127 not found: ID does not exist" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277557 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127"} err="failed to get container status \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": rpc error: code = NotFound desc = could not find container \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": container with ID starting with ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277584 4839 scope.go:117] "RemoveContainer" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 
04:31:02.295981 4839 scope.go:117] "RemoveContainer" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.307968 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.320844 4839 scope.go:117] "RemoveContainer" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338186 4839 scope.go:117] "RemoveContainer" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.338574 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": container with ID starting with 3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4 not found: ID does not exist" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338614 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} err="failed to get container status \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": rpc error: code = NotFound desc = could not find container \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": container with ID starting with 3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338640 4839 scope.go:117] "RemoveContainer" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" 
Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.339010 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": container with ID starting with efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb not found: ID does not exist" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339046 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb"} err="failed to get container status \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": rpc error: code = NotFound desc = could not find container \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": container with ID starting with efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339067 4839 scope.go:117] "RemoveContainer" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.339290 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": container with ID starting with b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6 not found: ID does not exist" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339304 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6"} err="failed to get container status 
\"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": rpc error: code = NotFound desc = could not find container \"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": container with ID starting with b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339315 4839 scope.go:117] "RemoveContainer" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.349923 4839 scope.go:117] "RemoveContainer" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.362778 4839 scope.go:117] "RemoveContainer" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.376331 4839 scope.go:117] "RemoveContainer" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.381745 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": container with ID starting with e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817 not found: ID does not exist" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.381793 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} err="failed to get container status \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": rpc error: code = NotFound desc = could not find container \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": container with ID starting 
with e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.381821 4839 scope.go:117] "RemoveContainer" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.382883 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": container with ID starting with 7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680 not found: ID does not exist" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.382934 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} err="failed to get container status \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": rpc error: code = NotFound desc = could not find container \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": container with ID starting with 7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.382968 4839 scope.go:117] "RemoveContainer" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.383414 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": container with ID starting with 976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9 not found: ID does not exist" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 
crc kubenswrapper[4839]: I0321 04:31:02.383433 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} err="failed to get container status \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": rpc error: code = NotFound desc = could not find container \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": container with ID starting with 976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.383447 4839 scope.go:117] "RemoveContainer" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.395559 4839 scope.go:117] "RemoveContainer" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.395971 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": container with ID starting with 10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9 not found: ID does not exist" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.396006 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"} err="failed to get container status \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": rpc error: code = NotFound desc = could not find container \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": container with ID starting with 10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9 not found: ID does not exist" Mar 21 
04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.396063 4839 scope.go:117] "RemoveContainer" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.407864 4839 scope.go:117] "RemoveContainer" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.455809 4839 scope.go:117] "RemoveContainer" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.460508 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" path="/var/lib/kubelet/pods/0b7a7313-21c4-4909-9ebe-ebe552b29b8c/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.461311 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" path="/var/lib/kubelet/pods/6240548e-b827-4fdb-b2be-c7187d6a28e8/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.461875 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a571df-f531-458b-9aed-6de99e4607e1" path="/var/lib/kubelet/pods/65a571df-f531-458b-9aed-6de99e4607e1/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.477736 4839 scope.go:117] "RemoveContainer" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.478168 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": container with ID starting with afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a not found: ID does not exist" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478202 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} err="failed to get container status \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": rpc error: code = NotFound desc = could not find container \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": container with ID starting with afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478228 4839 scope.go:117] "RemoveContainer" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.478550 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": container with ID starting with 32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e not found: ID does not exist" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478585 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e"} err="failed to get container status \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": rpc error: code = NotFound desc = could not find container \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": container with ID starting with 32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478599 4839 scope.go:117] "RemoveContainer" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 
04:31:02.478797 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": container with ID starting with 1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231 not found: ID does not exist" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478817 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231"} err="failed to get container status \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": rpc error: code = NotFound desc = could not find container \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": container with ID starting with 1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.500401 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.505581 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.516755 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.520954 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" 
event={"ID":"df9bf95b-dc8f-4104-9c6c-873159393850","Type":"ContainerStarted","Data":"a61fd4afaf9ad306000624583773ad7dfede05b26f16296be0baf8474b7fb7ab"} Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220143 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" event={"ID":"df9bf95b-dc8f-4104-9c6c-873159393850","Type":"ContainerStarted","Data":"58549bd273c0de0af6511133c6dd48f3904f237e11edf620a85c316cab17625e"} Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220322 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.225758 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.245393 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" podStartSLOduration=2.245375078 podStartE2EDuration="2.245375078s" podCreationTimestamp="2026-03-21 04:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:31:03.23853028 +0000 UTC m=+467.566316986" watchObservedRunningTime="2026-03-21 04:31:03.245375078 +0000 UTC m=+467.573161764" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.307673 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"] Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308222 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308235 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308256 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308263 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308274 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308281 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308290 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308296 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308308 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308316 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308329 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308336 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308347 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308355 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308365 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308370 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308378 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308384 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308392 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308398 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308406 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308411 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308420 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308425 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-content"
Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308433 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308447 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-utilities"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308529 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308542 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308550 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308557 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308590 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.310204 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.316002 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"]
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.318260 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330807 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432525 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.433129 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.433526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.459061 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.464713 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" path="/var/lib/kubelet/pods/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c/volumes"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.466438 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" path="/var/lib/kubelet/pods/6513c45b-dd98-40b0-b69c-94db4d1c916e/volumes"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.504046 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"]
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.505036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.510044 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.514959 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"]
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.634343 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635543 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635594 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.738668 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739106 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739109 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739183 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739748 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.758394 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.846879 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.043096 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"]
Mar 21 04:31:05 crc kubenswrapper[4839]: W0321 04:31:05.050986 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28c0850_90f8_445b_be34_13ab0d940eb4.slice/crio-7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9 WatchSource:0}: Error finding container 7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9: Status 404 returned error can't find the container with id 7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9
Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.249721 4839 generic.go:334] "Generic (PLEG): container finished" podID="e28c0850-90f8-445b-be34-13ab0d940eb4" containerID="671e02cd5c07dca793f66a536c30face9575efef661dd6ea8f93ced61743edb3" exitCode=0
Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.249942 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerDied","Data":"671e02cd5c07dca793f66a536c30face9575efef661dd6ea8f93ced61743edb3"}
Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.250091 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9"}
Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.875863 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"]
Mar 21 04:31:05 crc kubenswrapper[4839]: W0321 04:31:05.881413 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2de7c7a_fc46_44bc_9fad_d346e82f8ebc.slice/crio-c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae WatchSource:0}: Error finding container c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae: Status 404 returned error can't find the container with id c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255243 4839 generic.go:334] "Generic (PLEG): container finished" podID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerID="7ac3f706c4d746984f3de670c6eb113b6380888615edcc9dab4dfbd09d139f6d" exitCode=0
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255350 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerDied","Data":"7ac3f706c4d746984f3de670c6eb113b6380888615edcc9dab4dfbd09d139f6d"}
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255657 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerStarted","Data":"c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae"}
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.257436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0"}
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.704411 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99hx2"]
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.712698 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.718266 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.724668 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99hx2"]
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766500 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867327 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867788 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.902624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"]
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.903547 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.911181 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.915656 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.915765 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"]
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.028967 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069498 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069921 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170642 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170733 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.171173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.171279 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.188274 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.225881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.276697 4839 generic.go:334] "Generic (PLEG): container finished" podID="e28c0850-90f8-445b-be34-13ab0d940eb4" containerID="c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0" exitCode=0
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.276917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerDied","Data":"c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0"}
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.478738 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99hx2"]
Mar 21 04:31:07 crc kubenswrapper[4839]: W0321 04:31:07.489036 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f96bb3_505b_4c7b_bc6d_b0a465c7daae.slice/crio-5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608 WatchSource:0}: Error finding container 5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608: Status 404 returned error can't find the container with id 5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608
Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.623923 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"]
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.284663 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"ad183f3dc059b05c56ba2d142f216f479d5cd17174ed23542746efc702343255"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.288422 4839 generic.go:334] "Generic (PLEG): container finished" podID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerID="78d7e4f0f24f3717e0672a2931c7a29a770677751b0f588204cc8056a3eacae1" exitCode=0
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.288471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerDied","Data":"78d7e4f0f24f3717e0672a2931c7a29a770677751b0f588204cc8056a3eacae1"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.291594 4839 generic.go:334] "Generic (PLEG): container finished" podID="1d4943ad-c109-47a0-bcc8-4eb1a89836ca" containerID="8dc14d18d8cf81dc96e16afef9fbbd62f7059647583f5f33c915a3c943cd863d" exitCode=0
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.292134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerDied","Data":"8dc14d18d8cf81dc96e16afef9fbbd62f7059647583f5f33c915a3c943cd863d"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.292186 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"8425816cb044e883f5be89447f594c78ef1d0a1b7dc84bbfc79a41aee4e2ccff"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296454 4839 generic.go:334] "Generic (PLEG): container finished" podID="51f96bb3-505b-4c7b-bc6d-b0a465c7daae" containerID="cbac869c90a20289dbbcaf58c9bf7ffae0fa45f3c2c8a2348fee1c393d4640c5" exitCode=0
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerDied","Data":"cbac869c90a20289dbbcaf58c9bf7ffae0fa45f3c2c8a2348fee1c393d4640c5"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296529 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerStarted","Data":"5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608"}
Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.312413 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7sxqv" podStartSLOduration=1.802340448 podStartE2EDuration="4.312395824s" podCreationTimestamp="2026-03-21 04:31:04 +0000 UTC" firstStartedPulling="2026-03-21 04:31:05.255042666 +0000 UTC m=+469.582829342" lastFinishedPulling="2026-03-21 04:31:07.765098042 +0000 UTC m=+472.092884718" observedRunningTime="2026-03-21 04:31:08.308792695 +0000 UTC m=+472.636579381" watchObservedRunningTime="2026-03-21 04:31:08.312395824 +0000 UTC m=+472.640182500"
Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.306123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerStarted","Data":"eaf8ffe3b2691785f746aa6a66a7e03fa72af7beca0e7bf2a2ca95ff339769f2"}
Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.307930 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf"}
Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.329257 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8p22k" podStartSLOduration=2.9310292860000002 podStartE2EDuration="5.329241078s" podCreationTimestamp="2026-03-21 04:31:04 +0000 UTC" firstStartedPulling="2026-03-21 04:31:06.257961519 +0000 UTC m=+470.585748195" lastFinishedPulling="2026-03-21 04:31:08.656173301 +0000 UTC m=+472.983959987" observedRunningTime="2026-03-21 04:31:09.323968883 +0000 UTC m=+473.651755559" watchObservedRunningTime="2026-03-21 04:31:09.329241078 +0000 UTC m=+473.657027754"
Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.314747 4839 generic.go:334] "Generic (PLEG): container finished" podID="1d4943ad-c109-47a0-bcc8-4eb1a89836ca" containerID="ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf" exitCode=0
Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.314848 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerDied","Data":"ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf"}
Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.318812 4839 generic.go:334] "Generic (PLEG): container finished" podID="51f96bb3-505b-4c7b-bc6d-b0a465c7daae" containerID="aa598d041afcc73aec2b39e045139334e7de1f9ed9f722e0af6d0c3eaf74de2b" exitCode=0
Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.319340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerDied","Data":"aa598d041afcc73aec2b39e045139334e7de1f9ed9f722e0af6d0c3eaf74de2b"}
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.022456 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2"
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.079698 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"]
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.325333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerStarted","Data":"54638761080df82c2fd1dbca89c408f24ffd27149bbba2519c39a4fa9f226ac7"}
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.327392 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"ceb673ff766e58c024b0d0da7fd37d67489e5e02914e27fa16da101592e95b29"}
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.342960 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99hx2" podStartSLOduration=2.7120386 podStartE2EDuration="5.342941647s" podCreationTimestamp="2026-03-21 04:31:06 +0000 UTC" firstStartedPulling="2026-03-21 04:31:08.298044651 +0000 UTC m=+472.625831327" lastFinishedPulling="2026-03-21 04:31:10.928947678 +0000 UTC m=+475.256734374" observedRunningTime="2026-03-21 04:31:11.34268919 +0000 UTC m=+475.670475866" watchObservedRunningTime="2026-03-21 04:31:11.342941647 +0000 UTC m=+475.670728323"
Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.362416 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xg8xw" podStartSLOduration=2.767659224 podStartE2EDuration="5.36239948s" podCreationTimestamp="2026-03-21 04:31:06 +0000 UTC" firstStartedPulling="2026-03-21 04:31:08.294103983 +0000 UTC m=+472.621890659" lastFinishedPulling="2026-03-21 04:31:10.888844239 +0000 UTC m=+475.216630915" observedRunningTime="2026-03-21 04:31:11.361335271 +0000 UTC m=+475.689121947" watchObservedRunningTime="2026-03-21 04:31:11.36239948 +0000 UTC m=+475.690186156"
Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.635445 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.635837 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.674692 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.847533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.847779 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:15 crc kubenswrapper[4839]: I0321 04:31:15.412839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7sxqv"
Mar 21 04:31:15 crc kubenswrapper[4839]: I0321 04:31:15.895922 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8p22k" podUID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerName="registry-server" probeResult="failure" output=<
Mar 21 04:31:15 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s
Mar 21 04:31:15 crc kubenswrapper[4839]: >
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.029648 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.030625 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.065031 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.226620 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.226685 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.262641 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.393638 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xg8xw"
Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.394533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99hx2"
Mar 21 04:31:24 crc kubenswrapper[4839]: I0321 04:31:24.895852 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p22k"
Mar 21 04:31:24 crc kubenswrapper[4839]: I0321 04:31:24.941086 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:30 crc kubenswrapper[4839]: I0321 04:31:30.980486 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:31:30 crc kubenswrapper[4839]: I0321 04:31:30.980933 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:31:35 crc kubenswrapper[4839]: I0321 04:31:35.659350 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.122348 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" containerID="cri-o://0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" gracePeriod=30 Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.485409 4839 generic.go:334] "Generic (PLEG): container finished" podID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerID="0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" exitCode=0 Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.485508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerDied","Data":"0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446"} Mar 21 04:31:36 crc kubenswrapper[4839]: 
I0321 04:31:36.534757 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.660867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661175 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661229 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.662046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.666698 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.666761 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667199 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw" (OuterVolumeSpecName: "kube-api-access-pfdvw") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "kube-api-access-pfdvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667241 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667863 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667880 4839 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667894 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667904 4839 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.669293 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.674293 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.678040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.678725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769862 4839 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769901 4839 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769943 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.491876 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerDied","Data":"db289ed2561962adc1edb7c7cc7d0a2aafe884fed424734dbdd27242d856949f"} Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.491942 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.492192 4839 scope.go:117] "RemoveContainer" containerID="0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.526761 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.535262 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:31:38 crc kubenswrapper[4839]: I0321 04:31:38.459638 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" path="/var/lib/kubelet/pods/7ef3f28d-e496-434e-a803-3b9a0fa24690/volumes" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.141386 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: E0321 04:32:00.142288 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142301 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142424 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142941 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.145935 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.146119 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.146279 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.149032 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.176444 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.278144 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.298675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " 
pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.496218 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.890529 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980635 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980897 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980945 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.981477 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.981534 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" gracePeriod=600 Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.635715 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" exitCode=0 Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.635837 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"} Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.636126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.636156 4839 scope.go:117] "RemoveContainer" containerID="a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.637887 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerStarted","Data":"df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1"} Mar 21 04:32:02 crc kubenswrapper[4839]: I0321 04:32:02.649971 4839 generic.go:334] "Generic (PLEG): container finished" podID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerID="c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b" exitCode=0 Mar 21 04:32:02 crc kubenswrapper[4839]: I0321 
04:32:02.650503 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerDied","Data":"c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b"} Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.920505 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.939547 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"fbe4754f-40a1-43e0-827f-557507a5e7d1\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.945994 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4" (OuterVolumeSpecName: "kube-api-access-lxgj4") pod "fbe4754f-40a1-43e0-827f-557507a5e7d1" (UID: "fbe4754f-40a1-43e0-827f-557507a5e7d1"). InnerVolumeSpecName "kube-api-access-lxgj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.040783 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") on node \"crc\" DevicePath \"\"" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerDied","Data":"df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1"} Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665895 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.976764 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.979680 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:32:06 crc kubenswrapper[4839]: I0321 04:32:06.461733 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" path="/var/lib/kubelet/pods/609ace61-45d1-44f6-b378-fb97eecf2374/volumes" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.149381 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: E0321 04:34:00.150440 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.150468 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.150760 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.151466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.155772 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.156224 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.156878 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.158648 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.162965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.264695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: 
\"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.283753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.483189 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.904615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.918805 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:34:01 crc kubenswrapper[4839]: I0321 04:34:01.370691 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerStarted","Data":"adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1"} Mar 21 04:34:02 crc kubenswrapper[4839]: I0321 04:34:02.378305 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerStarted","Data":"28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d"} Mar 21 04:34:02 crc kubenswrapper[4839]: I0321 04:34:02.390241 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29567794-rclnt" podStartSLOduration=1.4019920510000001 podStartE2EDuration="2.390223858s" podCreationTimestamp="2026-03-21 04:34:00 +0000 UTC" firstStartedPulling="2026-03-21 04:34:00.917240398 +0000 UTC m=+645.245027104" lastFinishedPulling="2026-03-21 04:34:01.905472205 +0000 UTC m=+646.233258911" observedRunningTime="2026-03-21 04:34:02.389614701 +0000 UTC m=+646.717401377" watchObservedRunningTime="2026-03-21 04:34:02.390223858 +0000 UTC m=+646.718010534" Mar 21 04:34:03 crc kubenswrapper[4839]: I0321 04:34:03.386453 4839 generic.go:334] "Generic (PLEG): container finished" podID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerID="28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d" exitCode=0 Mar 21 04:34:03 crc kubenswrapper[4839]: I0321 04:34:03.386502 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerDied","Data":"28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d"} Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.622781 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.726082 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.732559 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9" (OuterVolumeSpecName: "kube-api-access-zcjv9") pod "2dfa2356-3aca-4ed1-bfce-93cc8857825d" (UID: "2dfa2356-3aca-4ed1-bfce-93cc8857825d"). InnerVolumeSpecName "kube-api-access-zcjv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.827311 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") on node \"crc\" DevicePath \"\"" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399564 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerDied","Data":"adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1"} Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399948 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399786 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.445724 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.448700 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:34:06 crc kubenswrapper[4839]: I0321 04:34:06.467514 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" path="/var/lib/kubelet/pods/a45deb0c-4247-4d23-86db-a897c7f7e7f2/volumes" Mar 21 04:34:29 crc kubenswrapper[4839]: I0321 04:34:29.860091 4839 scope.go:117] "RemoveContainer" containerID="4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927" Mar 21 04:34:29 crc kubenswrapper[4839]: I0321 04:34:29.901849 4839 scope.go:117] "RemoveContainer" containerID="de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc" Mar 21 04:34:30 crc kubenswrapper[4839]: I0321 04:34:30.981216 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:34:30 crc kubenswrapper[4839]: I0321 04:34:30.981280 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:00 crc kubenswrapper[4839]: I0321 04:35:00.980355 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:35:00 crc kubenswrapper[4839]: I0321 04:35:00.981225 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980014 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980598 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980660 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.981451 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.981530 4839 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" gracePeriod=600 Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.884782 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" exitCode=0 Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.884871 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.885369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.885393 4839 scope.go:117] "RemoveContainer" containerID="e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.131794 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:00 crc kubenswrapper[4839]: E0321 04:36:00.132593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.132608 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc 
kubenswrapper[4839]: I0321 04:36:00.132734 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.133151 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136388 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136704 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136954 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.151135 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.287547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.388977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.412765 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.458038 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.835546 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:01 crc kubenswrapper[4839]: I0321 04:36:01.049823 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerStarted","Data":"81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789"} Mar 21 04:36:03 crc kubenswrapper[4839]: I0321 04:36:03.061538 4839 generic.go:334] "Generic (PLEG): container finished" podID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerID="edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484" exitCode=0 Mar 21 04:36:03 crc kubenswrapper[4839]: I0321 04:36:03.061642 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerDied","Data":"edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484"} Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.324456 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.443294 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.449076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw" (OuterVolumeSpecName: "kube-api-access-8tcnw") pod "c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" (UID: "c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b"). InnerVolumeSpecName "kube-api-access-8tcnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.545172 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") on node \"crc\" DevicePath \"\"" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.075915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerDied","Data":"81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789"} Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.075972 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.076019 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.386013 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.390431 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:36:06 crc kubenswrapper[4839]: I0321 04:36:06.458289 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" path="/var/lib/kubelet/pods/64e6efc9-03ce-4af4-bcc2-bc64ceebc652/volumes" Mar 21 04:36:29 crc kubenswrapper[4839]: I0321 04:36:29.971226 4839 scope.go:117] "RemoveContainer" containerID="7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.911422 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:49 crc kubenswrapper[4839]: E0321 04:36:49.913398 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.913414 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.913532 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.914716 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.918616 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.918994 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-59dq7" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.924335 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.925123 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.925553 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.929485 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.939398 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8lbcs" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.947839 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.953616 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.954470 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.956165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zw6m8" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.968829 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102727 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtvn\" (UniqueName: \"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtvn\" (UniqueName: 
\"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.231455 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtvn\" (UniqueName: \"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.232592 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.233399 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.250478 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.263209 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.272917 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.676876 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.683049 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:50 crc kubenswrapper[4839]: W0321 04:36:50.690593 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaed7a16_7023_463e_9d60_3f56f091f73e.slice/crio-71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110 WatchSource:0}: Error finding container 71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110: Status 404 returned error can't find the container with id 71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110 Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.696252 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:50 crc 
kubenswrapper[4839]: W0321 04:36:50.703418 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70f5b8f_f5a8_4829_b4e1_7a7a12dddd1f.slice/crio-9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a WatchSource:0}: Error finding container 9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a: Status 404 returned error can't find the container with id 9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.355778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" event={"ID":"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f","Type":"ContainerStarted","Data":"9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a"} Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.357152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x2cpt" event={"ID":"daed7a16-7023-463e-9d60-3f56f091f73e","Type":"ContainerStarted","Data":"71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110"} Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.358111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" event={"ID":"814a91ac-5e2f-4479-88a3-254e4216e50c","Type":"ContainerStarted","Data":"816fae4ac38734a4473d5aa061bfa6f3a4f63f96f190b173b9adbcb2aaffd2a2"} Mar 21 04:36:55 crc kubenswrapper[4839]: I0321 04:36:55.386191 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" event={"ID":"814a91ac-5e2f-4479-88a3-254e4216e50c","Type":"ContainerStarted","Data":"58209ebe8c80a719c78fd621af2586a50b4ce6e9a54e4085cfd695e0fb52d176"} Mar 21 04:36:55 crc kubenswrapper[4839]: I0321 04:36:55.406013 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" podStartSLOduration=2.527279573 podStartE2EDuration="6.405997753s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.702671083 +0000 UTC m=+815.030457809" lastFinishedPulling="2026-03-21 04:36:54.581389303 +0000 UTC m=+818.909175989" observedRunningTime="2026-03-21 04:36:55.403860556 +0000 UTC m=+819.731647272" watchObservedRunningTime="2026-03-21 04:36:55.405997753 +0000 UTC m=+819.733784429" Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.401447 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" event={"ID":"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f","Type":"ContainerStarted","Data":"5e8efcf0912016e52ad90c06eb91b7c511365ffc59b290ed94cf862a643a434c"} Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.402022 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.405378 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x2cpt" event={"ID":"daed7a16-7023-463e-9d60-3f56f091f73e","Type":"ContainerStarted","Data":"361b894f501f07723005e385884369bf997646eee850b84b3d1ce3020fefe03c"} Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.426711 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" podStartSLOduration=2.762408116 podStartE2EDuration="8.426683404s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.705631463 +0000 UTC m=+815.033418159" lastFinishedPulling="2026-03-21 04:36:56.369906771 +0000 UTC m=+820.697693447" observedRunningTime="2026-03-21 04:36:57.422784479 +0000 UTC m=+821.750571235" watchObservedRunningTime="2026-03-21 04:36:57.426683404 +0000 UTC m=+821.754470160" Mar 21 04:36:57 crc 
kubenswrapper[4839]: I0321 04:36:57.446303 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-x2cpt" podStartSLOduration=2.651085407 podStartE2EDuration="8.446276908s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.693910409 +0000 UTC m=+815.021697095" lastFinishedPulling="2026-03-21 04:36:56.48910192 +0000 UTC m=+820.816888596" observedRunningTime="2026-03-21 04:36:57.44447656 +0000 UTC m=+821.772263276" watchObservedRunningTime="2026-03-21 04:36:57.446276908 +0000 UTC m=+821.774063614" Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.951645 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952851 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" containerID="cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952870 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" containerID="cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952952 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" containerID="cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953009 4839 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" containerID="cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953070 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" containerID="cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953088 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" containerID="cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953121 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.988849 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" containerID="cri-o://06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" gracePeriod=30 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.246689 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.249634 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-acl-logging/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.250038 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-controller/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.250399 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309655 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vdfz5"] Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309866 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309874 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309880 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309889 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309894 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: 
E0321 04:37:00.309904 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309909 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309916 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309921 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309932 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309937 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309944 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309949 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309959 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kubecfg-setup" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309965 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kubecfg-setup" Mar 21 04:37:00 crc 
kubenswrapper[4839]: E0321 04:37:00.309975 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309981 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309991 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309998 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310004 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310010 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310105 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310119 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310126 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310133 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.310140 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310146 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310153 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310161 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310170 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310177 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310184 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310284 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310291 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310298 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310303 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310388 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.311871 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354041 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354108 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354160 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354204 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354274 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354514 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354543 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354560 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354604 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354619 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354628 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354680 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354748 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354763 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash" (OuterVolumeSpecName: "host-slash") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354787 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354794 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket" (OuterVolumeSpecName: "log-socket") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354807 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log" (OuterVolumeSpecName: "node-log") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354809 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354827 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354833 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354856 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354882 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354907 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354929 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354906 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354933 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354966 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355049 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355121 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355144 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355167 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355190 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355358 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355396 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355471 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355584 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355604 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355639 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod 
\"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355789 4839 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355800 4839 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355809 4839 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355818 4839 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355826 4839 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355834 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355843 4839 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355851 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355860 4839 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355877 4839 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355885 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355893 4839 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355902 4839 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355909 4839 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355917 4839 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355926 4839 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355933 4839 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc 
kubenswrapper[4839]: I0321 04:37:00.359640 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2" (OuterVolumeSpecName: "kube-api-access-cdph2") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "kube-api-access-cdph2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.359902 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.370412 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.421954 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.424769 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-acl-logging/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.425453 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-controller/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426045 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426076 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426087 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426097 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426107 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" 
containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426117 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426125 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" exitCode=143 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426134 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" exitCode=143 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426188 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426200 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426205 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426370 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426388 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426223 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426400 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426486 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.426498 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426507 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426514 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426521 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426528 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426534 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426548 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426579 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426587 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426595 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426603 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426610 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426617 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426625 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426632 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426639 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426647 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426657 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426668 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426676 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426684 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426692 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426699 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426710 4839 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426720 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426729 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426740 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426750 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426776 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426788 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426798 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426807 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426815 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426822 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426829 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426836 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426843 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426850 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.427657 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428108 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428138 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4" exitCode=2 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428164 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428179 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428621 4839 scope.go:117] "RemoveContainer" containerID="44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.458301 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461282 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461504 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461508 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461749 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" 
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461817 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461851 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461953 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462063 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462127 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462504 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462523 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" 
(UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462928 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463121 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463068 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463301 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463407 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463461 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.464310 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464341 4839 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464356 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464678 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464755 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.465095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.465927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.471746 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.487261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.494673 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.497141 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.499377 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.513662 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.543895 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.556269 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.567823 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.581652 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.594647 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.606434 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619090 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.619413 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619445 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619466 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.619728 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619749 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID 
starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619763 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620114 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620137 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620150 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620471 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620489 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620501 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620726 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620765 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620780 4839 scope.go:117] "RemoveContainer" 
containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620947 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620967 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620980 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621236 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621298 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621338 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621699 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621719 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621732 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621964 4839 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621983 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621996 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.622282 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622321 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container 
\"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622346 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622649 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622668 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622918 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622936 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623170 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623193 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623457 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623475 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623546 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623788 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623809 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624119 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624162 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624530 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not 
find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624548 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624847 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624918 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625175 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625196 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625504 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625531 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625857 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625878 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626092 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 
616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626128 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626439 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626760 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626789 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627119 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627147 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627422 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627443 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627793 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not 
exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627815 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628590 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628617 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628897 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628917 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629157 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629176 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629446 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629465 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629739 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629757 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630018 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630038 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630342 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630361 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630589 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630608 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630995 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631011 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631275 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631294 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631603 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631622 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631974 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632002 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632301 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632341 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632759 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist"
Mar 21 04:37:00 crc kubenswrapper[4839]: W0321 04:37:00.641536 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb64f802_d294_4fd5_a691_da3096ee0978.slice/crio-a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c WatchSource:0}: Error finding container a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c: Status 404 returned error can't find the container with id a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.438824 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log"
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.440285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log"
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.440489 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"71df07ffacae9482203a8cabda642d7215f74bd8f6c84a260cf8febd9e078ba4"}
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443240 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb64f802-d294-4fd5-a691-da3096ee0978" containerID="6b659d570cf01643372fe7dfe0d11fb064f2d21ec2dfdb605b3b0751a44a44db" exitCode=0
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443299 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerDied","Data":"6b659d570cf01643372fe7dfe0d11fb064f2d21ec2dfdb605b3b0751a44a44db"}
Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.451563 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"f879662c8aebd4b6919b577691c554116168a132db8b6705855aa831492be275"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.460277 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" path="/var/lib/kubelet/pods/d634043b-c9ec-4469-b267-26053b1f02f9/volumes"
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"fed131455fa0580c3e4f7918efeb38a28bb7ec55c2dddce52627784de7a65f59"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461299 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"a6483341a255f761d85ad7aeb013462317460d06af975951231edc45f9e20aa1"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"ceb79d4636a8f9f85d22b1b5027dc86cd38b45a3cf957b7e8581665713d35d02"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"2ea140b265ee9f3cf33bac33aa856c4694af4dafd7b61dc901561c695b98234f"}
Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461341 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"2f70a4e247f7d8c7918a238fe573958c42eada15b75225f677902d3d12197324"}
Mar 21 04:37:05 crc kubenswrapper[4839]: I0321 04:37:05.276690 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6"
Mar 21 04:37:05 crc kubenswrapper[4839]: I0321 04:37:05.484033 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"1521c8728685fe015ce907663e6c1745e77d64e79ad395f301a027b893061cdb"}
Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.504173 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"e81ce74fdc8cf188c028ae5fd9384ef1daaec91296ec73394ea137308db19fe8"}
Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.504700 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.541612 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.555078 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" podStartSLOduration=7.555064002 podStartE2EDuration="7.555064002s" podCreationTimestamp="2026-03-21 04:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:37:07.550785307 +0000 UTC m=+831.878572003" watchObservedRunningTime="2026-03-21 04:37:07.555064002 +0000 UTC m=+831.882850668"
Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.508835 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.509091 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.536099 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:14 crc kubenswrapper[4839]: I0321 04:37:14.302880 4839 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.030783 4839 scope.go:117] "RemoveContainer" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"
Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.634958 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log"
Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.645101 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.353807 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"]
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.355471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.359916 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.365131 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"]
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485397 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485449 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485551 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587246 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587437 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587876 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.588845 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.605771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.672702 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.847758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"]
Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727365 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerID="97f916ce8a710bdca34d1c2baa215bdf07d025722c2d6356867fad6af911dbeb" exitCode=0
Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727416 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"97f916ce8a710bdca34d1c2baa215bdf07d025722c2d6356867fad6af911dbeb"}
Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727446 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerStarted","Data":"a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a"}
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.639798 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"]
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.641725 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.648757 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"]
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.713984 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.714094 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.714168 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815181 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815820 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.837702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.963044 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr"
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.189422 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"]
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739624 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d" exitCode=0
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739706 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d"}
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739736 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"6d41a78d0e49e188c99866f252414054ca009b7f7c05cfeefe8bbaeba6908685"}
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.741977 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerID="7f2f1955493b275d48d8bf9215382a18b7c71e8975e69015be47c6034da4c8e7" exitCode=0
Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.742034 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"7f2f1955493b275d48d8bf9215382a18b7c71e8975e69015be47c6034da4c8e7"}
Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.751173 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerID="a71f8e9a655fe731cfee84e65c0d6b9444ddc523531becc69fa5e438aece2bc6" exitCode=0
Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.751241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"a71f8e9a655fe731cfee84e65c0d6b9444ddc523531becc69fa5e438aece2bc6"}
Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.754321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9"}
Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.762321 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9" exitCode=0
Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.763771 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9"}
Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.976018 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059068 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") "
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059148 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") "
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") "
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059777 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle" (OuterVolumeSpecName: "bundle") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.066828 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54" (OuterVolumeSpecName: "kube-api-access-ffk54") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "kube-api-access-ffk54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.073091 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util" (OuterVolumeSpecName: "util") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.160957 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.161003 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") on node \"crc\" DevicePath \"\""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.161013 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") on node \"crc\" DevicePath \"\""
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.768791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907"}
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a"}
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771189 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771297 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a"
Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.790054 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2z7jr" podStartSLOduration=2.273817125 podStartE2EDuration="4.790032723s" podCreationTimestamp="2026-03-21 04:37:48 +0000 UTC" firstStartedPulling="2026-03-21 04:37:49.741305348 +0000 UTC m=+874.069092024" lastFinishedPulling="2026-03-21 04:37:52.257520946 +0000 UTC m=+876.585307622" observedRunningTime="2026-03-21 04:37:52.787005079 +0000 UTC m=+877.114791765" watchObservedRunningTime="2026-03-21 04:37:52.790032723 +0000 UTC m=+877.117819399"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.322768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"]
Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 04:37:54.322983 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="util"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.322994 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="util"
Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 04:37:54.323007 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323013 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract"
Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 04:37:54.323021 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="pull"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323027 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="pull"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323117 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323521 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327194 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vb5wp"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327274 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327336 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.333615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"]
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.386009 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"
Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.486999 4839 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.505228 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.637223 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.871812 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"] Mar 21 04:37:55 crc kubenswrapper[4839]: I0321 04:37:55.795828 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" event={"ID":"fbd83ba5-ac43-45f6-8a15-78ba82a246f7","Type":"ContainerStarted","Data":"8770be52a19063d93da5e410bb53a0f306abb53f735dd1eafb8d2751664e9ddd"} Mar 21 04:37:57 crc kubenswrapper[4839]: I0321 04:37:57.809274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" event={"ID":"fbd83ba5-ac43-45f6-8a15-78ba82a246f7","Type":"ContainerStarted","Data":"44e281b5b66f07c1c9612fc027c822b02e4291277aac26388ba7d2a8dad97c05"} Mar 21 04:37:57 crc kubenswrapper[4839]: I0321 04:37:57.828525 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" podStartSLOduration=1.8959327639999999 
podStartE2EDuration="3.828494178s" podCreationTimestamp="2026-03-21 04:37:54 +0000 UTC" firstStartedPulling="2026-03-21 04:37:54.889854292 +0000 UTC m=+879.217640968" lastFinishedPulling="2026-03-21 04:37:56.822415706 +0000 UTC m=+881.150202382" observedRunningTime="2026-03-21 04:37:57.825537645 +0000 UTC m=+882.153324361" watchObservedRunningTime="2026-03-21 04:37:57.828494178 +0000 UTC m=+882.156280864" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.954292 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.955100 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.957172 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xt4s6" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.963105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.963310 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.966282 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.991651 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.992292 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.994025 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.013517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.017625 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k57vv"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.018448 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.031329 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043585 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043631 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9sv7\" (UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043763 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.098368 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.099059 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104518 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104551 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nr6km" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104721 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.112359 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145183 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145260 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9sv7\" 
(UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145422 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.145787 4839 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.145827 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair podName:5a2485ca-cb21-4edf-b074-f7ac255f45f8 
nodeName:}" failed. No retries permitted until 2026-03-21 04:37:59.645813338 +0000 UTC m=+883.973600014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair") pod "nmstate-webhook-5f558f5558-7ghd4" (UID: "5a2485ca-cb21-4edf-b074-f7ac255f45f8") : secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.165500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.165507 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9sv7\" (UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246492 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 
04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246654 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246664 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246760 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246800 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 
04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246843 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.247092 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.247382 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.248044 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.255290 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: 
I0321 04:37:59.269157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.276676 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.279221 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.343366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.363784 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.364476 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.387270 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.420066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448361 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448743 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448782 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448800 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448819 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.528988 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.547979 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc1639d_742f_41a6_8cb7_318997a4a8b1.slice/crio-22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a WatchSource:0}: Error finding container 22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a: Status 404 returned error can't find the container with id 22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549891 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549907 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549950 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.550000 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.550026 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.551987 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.552589 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.552693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.553456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod 
\"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.555753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.556261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.570262 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.625387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.633149 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7a66bb_3731_4f75_9a7f_5b9d07a36b39.slice/crio-3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0 WatchSource:0}: Error finding container 3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0: Status 404 returned error can't find the container with id 
3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0 Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.651093 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.651354 4839 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.651443 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair podName:5a2485ca-cb21-4edf-b074-f7ac255f45f8 nodeName:}" failed. No retries permitted until 2026-03-21 04:38:00.651422965 +0000 UTC m=+884.979209641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair") pod "nmstate-webhook-5f558f5558-7ghd4" (UID: "5a2485ca-cb21-4edf-b074-f7ac255f45f8") : secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.699803 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.819608 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k57vv" event={"ID":"42329e42-8b9b-45ed-ab04-bf12468d8859","Type":"ContainerStarted","Data":"b544eb484d97313f14bd105e4cc46e22488a4e8e42e07ee31d8399d900ba3c0c"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.821164 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.828964 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" event={"ID":"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39","Type":"ContainerStarted","Data":"3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.869646 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.897119 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.898760 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed5e0f_3202_4b63_acb9_e689b9d1b1e4.slice/crio-32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125 WatchSource:0}: Error finding container 32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125: Status 404 returned error can't find the container with id 32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125 Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 
04:38:00.138809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.140746 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142605 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142696 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142847 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.150963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.258889 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.360385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.384179 4839 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.457996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.634052 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.656544 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: W0321 04:38:00.660898 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad32cfd7_7b60_4c76_8df2_eb2e65b102c3.slice/crio-bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961 WatchSource:0}: Error finding container bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961: Status 404 returned error can't find the container with id bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961 Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.663739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.673888 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.814366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.834338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerStarted","Data":"bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.835982 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95579fd9f-99csd" event={"ID":"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4","Type":"ContainerStarted","Data":"2bf04a9dca89aceb104eeeb4ef15898b4b2e73051968502edf52e79ebfde3a8b"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.836040 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95579fd9f-99csd" event={"ID":"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4","Type":"ContainerStarted","Data":"32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.853157 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-95579fd9f-99csd" podStartSLOduration=1.853142362 podStartE2EDuration="1.853142362s" podCreationTimestamp="2026-03-21 04:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:38:00.850280282 +0000 UTC m=+885.178066958" watchObservedRunningTime="2026-03-21 04:38:00.853142362 +0000 UTC m=+885.180929038" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 
04:38:00.980557 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.980642 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:00.999965 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:38:01 crc kubenswrapper[4839]: W0321 04:38:01.007433 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2485ca_cb21_4edf_b074_f7ac255f45f8.slice/crio-769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94 WatchSource:0}: Error finding container 769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94: Status 404 returned error can't find the container with id 769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94 Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:01.841466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" event={"ID":"5a2485ca-cb21-4edf-b074-f7ac255f45f8","Type":"ContainerStarted","Data":"769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94"} Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:01.841881 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2z7jr" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" 
containerName="registry-server" containerID="cri-o://ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" gracePeriod=2 Mar 21 04:38:03 crc kubenswrapper[4839]: I0321 04:38:03.855923 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" exitCode=0 Mar 21 04:38:03 crc kubenswrapper[4839]: I0321 04:38:03.856014 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.241323 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355180 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355259 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355330 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc 
kubenswrapper[4839]: I0321 04:38:05.356322 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities" (OuterVolumeSpecName: "utilities") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.362046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx" (OuterVolumeSpecName: "kube-api-access-gqkfx") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "kube-api-access-gqkfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.456636 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.457014 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.491197 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.558410 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.871251 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k57vv" event={"ID":"42329e42-8b9b-45ed-ab04-bf12468d8859","Type":"ContainerStarted","Data":"e899cf87c8d4d05afd09205bcf13b68549c60d7d20b9278cdf01d93eac37f787"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.871373 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.873553 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" event={"ID":"5a2485ca-cb21-4edf-b074-f7ac255f45f8","Type":"ContainerStarted","Data":"7a7308c909cd646a0935b0a3183be77fe7128ae384bb364352bd99ec09d0e108"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.873641 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.875277 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"ecd307d3a0d3c7e149e08fadad6255d3a00e77345c12a089ff24cda561605b7f"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.876834 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" event={"ID":"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39","Type":"ContainerStarted","Data":"df5cb5f435c7feedb45febdef13192c2506368609a50cd1db9d0cee2c50a3c52"} Mar 
21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.879029 4839 generic.go:334] "Generic (PLEG): container finished" podID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerID="3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6" exitCode=0 Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.879074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerDied","Data":"3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881629 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"6d41a78d0e49e188c99866f252414054ca009b7f7c05cfeefe8bbaeba6908685"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881666 4839 scope.go:117] "RemoveContainer" containerID="ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881666 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.885029 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k57vv" podStartSLOduration=2.191746971 podStartE2EDuration="7.885011652s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.39767136 +0000 UTC m=+883.725458036" lastFinishedPulling="2026-03-21 04:38:05.090936041 +0000 UTC m=+889.418722717" observedRunningTime="2026-03-21 04:38:05.884858967 +0000 UTC m=+890.212645673" watchObservedRunningTime="2026-03-21 04:38:05.885011652 +0000 UTC m=+890.212798348" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.903760 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" podStartSLOduration=1.435147887 podStartE2EDuration="6.903741484s" podCreationTimestamp="2026-03-21 04:37:59 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.634904704 +0000 UTC m=+883.962691380" lastFinishedPulling="2026-03-21 04:38:05.103498301 +0000 UTC m=+889.431284977" observedRunningTime="2026-03-21 04:38:05.899354912 +0000 UTC m=+890.227141628" watchObservedRunningTime="2026-03-21 04:38:05.903741484 +0000 UTC m=+890.231528170" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.935884 4839 scope.go:117] "RemoveContainer" containerID="e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.949693 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" podStartSLOduration=3.840377468 podStartE2EDuration="7.949668474s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:38:01.009670116 +0000 UTC m=+885.337456792" lastFinishedPulling="2026-03-21 04:38:05.118961122 +0000 UTC m=+889.446747798" 
observedRunningTime="2026-03-21 04:38:05.936200609 +0000 UTC m=+890.263987295" watchObservedRunningTime="2026-03-21 04:38:05.949668474 +0000 UTC m=+890.277455150" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.956752 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.961573 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.977676 4839 scope.go:117] "RemoveContainer" containerID="47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d" Mar 21 04:38:06 crc kubenswrapper[4839]: I0321 04:38:06.469089 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" path="/var/lib/kubelet/pods/36c4ce7f-79eb-4f18-8573-f6900d7812fe/volumes" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.124890 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.189321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.203064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd" (OuterVolumeSpecName: "kube-api-access-pwkcd") pod "ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" (UID: "ad32cfd7-7b60-4c76-8df2-eb2e65b102c3"). InnerVolumeSpecName "kube-api-access-pwkcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.291201 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerDied","Data":"bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961"} Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895340 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895341 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.172854 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.176527 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.466836 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" path="/var/lib/kubelet/pods/fbe4754f-40a1-43e0-827f-557507a5e7d1/volumes" Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.902880 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" 
event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"3766a4a598c503e95585b55577b46395d383991dadab9eee78dc2c3a32dc87a5"} Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.931922 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" podStartSLOduration=2.596305621 podStartE2EDuration="10.931896116s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.553629688 +0000 UTC m=+883.881416364" lastFinishedPulling="2026-03-21 04:38:07.889220183 +0000 UTC m=+892.217006859" observedRunningTime="2026-03-21 04:38:08.927130213 +0000 UTC m=+893.254916939" watchObservedRunningTime="2026-03-21 04:38:08.931896116 +0000 UTC m=+893.259682822" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.699887 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.699967 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.704308 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.910678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.959399 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:14 crc kubenswrapper[4839]: I0321 04:38:14.412763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:38:20 crc kubenswrapper[4839]: I0321 04:38:20.824675 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.115221 4839 scope.go:117] "RemoveContainer" containerID="c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b" Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.980625 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.981295 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746221 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746791 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-content" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746805 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-content" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746825 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-utilities" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746833 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" 
containerName="extract-utilities" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746842 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746849 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746862 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746869 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746988 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746998 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.748029 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.752221 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.762227 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790284 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790335 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: 
I0321 04:38:32.891477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.891591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.891630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.892027 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.892048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.912339 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.064245 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.284366 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:33 crc kubenswrapper[4839]: W0321 04:38:33.295716 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3471d2_6268_4816_bc09_31044e9989e7.slice/crio-855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0 WatchSource:0}: Error finding container 855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0: Status 404 returned error can't find the container with id 855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0 Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.548046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" 
event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerStarted","Data":"161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9"} Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.548363 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerStarted","Data":"855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0"} Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.556745 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9" exitCode=0 Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.556796 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9"} Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.998649 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" containerID="cri-o://fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" gracePeriod=15 Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.366527 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bj929_ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/console/0.log" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.366840 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.531849 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.531963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533040 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533233 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533313 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533934 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca" (OuterVolumeSpecName: "service-ca") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.534051 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533928 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.534730 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config" (OuterVolumeSpecName: "console-config") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.538796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.539631 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt" (OuterVolumeSpecName: "kube-api-access-q47zt") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "kube-api-access-q47zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.539963 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565083 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bj929_ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/console/0.log" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565141 4839 generic.go:334] "Generic (PLEG): container finished" podID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" exitCode=2 Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565173 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerDied","Data":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerDied","Data":"774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a"} Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565251 4839 scope.go:117] "RemoveContainer" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565369 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.583932 4839 scope.go:117] "RemoveContainer" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: E0321 04:38:35.584405 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": container with ID starting with fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6 not found: ID does not exist" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.584452 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} err="failed to get container status \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": rpc error: code = NotFound desc = could not find container \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": container with ID starting with fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6 not found: ID does not exist" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.617279 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.624387 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635015 4839 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635059 4839 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635069 4839 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635080 4839 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635094 4839 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635103 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635112 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: E0321 04:38:35.797014 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3471d2_6268_4816_bc09_31044e9989e7.slice/crio-conmon-31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3.scope\": RecentStats: unable to find data in memory cache]" Mar 21 
04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.466401 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" path="/var/lib/kubelet/pods/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/volumes" Mar 21 04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.574836 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3" exitCode=0 Mar 21 04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.574929 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3"} Mar 21 04:38:37 crc kubenswrapper[4839]: I0321 04:38:37.584459 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="5b7eb1cd60982ec264def859b7887dc44e416d3dd03acaa89f0c34a2119dcf7f" exitCode=0 Mar 21 04:38:37 crc kubenswrapper[4839]: I0321 04:38:37.584533 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"5b7eb1cd60982ec264def859b7887dc44e416d3dd03acaa89f0c34a2119dcf7f"} Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.799907 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.883623 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.883699 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle" (OuterVolumeSpecName: "bundle") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885896 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.889679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd" (OuterVolumeSpecName: "kube-api-access-z7fgd") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "kube-api-access-z7fgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.902767 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util" (OuterVolumeSpecName: "util") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.986528 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.986616 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600129 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0"} Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600180 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600182 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.731768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732522 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="pull" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732539 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="pull" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732550 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="util" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732557 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="util" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732616 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732642 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732648 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732769 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732778 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.733178 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736402 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736416 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736504 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tclds" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736613 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736695 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.757018 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdpfs\" (UniqueName: 
\"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898249 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898326 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.966203 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.966860 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969329 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j8csk" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969498 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969931 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.988869 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999262 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999314 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999342 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29ln\" (UniqueName: 
\"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999392 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdpfs\" (UniqueName: \"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.012936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " 
pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.025253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.025675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdpfs\" (UniqueName: \"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.050189 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100069 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29ln\" (UniqueName: \"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100148 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.104450 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.104616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.119356 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29ln\" (UniqueName: \"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.280310 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.542943 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.575284 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:48 crc kubenswrapper[4839]: W0321 04:38:48.585943 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca0627e2_8115_4514_ba93_47e00a823a31.slice/crio-869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f WatchSource:0}: Error finding container 869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f: Status 404 returned error can't find the container with id 869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.646927 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" event={"ID":"888cdc0b-241d-456a-9a9f-3ed253b3dbf3","Type":"ContainerStarted","Data":"57ebe20bbae5523fece00f859a64c821627f6a21eb01df8a452ebd5098d55e30"} Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.647789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" event={"ID":"ca0627e2-8115-4514-ba93-47e00a823a31","Type":"ContainerStarted","Data":"869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.675286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" event={"ID":"ca0627e2-8115-4514-ba93-47e00a823a31","Type":"ContainerStarted","Data":"1b0e0cb542507ea6f4d57710659512a3587810a5cb00d131aab412e1c70391f0"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.676041 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.677648 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" event={"ID":"888cdc0b-241d-456a-9a9f-3ed253b3dbf3","Type":"ContainerStarted","Data":"c692d3dfffe307c44a10c0a8af0140f6d4ef1e60ed1617a28e0bb2b9c00fb89e"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.677809 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.697480 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" podStartSLOduration=2.441074143 podStartE2EDuration="6.697455862s" podCreationTimestamp="2026-03-21 04:38:47 +0000 UTC" 
firstStartedPulling="2026-03-21 04:38:48.588963885 +0000 UTC m=+932.916750561" lastFinishedPulling="2026-03-21 04:38:52.845345604 +0000 UTC m=+937.173132280" observedRunningTime="2026-03-21 04:38:53.692120713 +0000 UTC m=+938.019907429" watchObservedRunningTime="2026-03-21 04:38:53.697455862 +0000 UTC m=+938.025242558" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.718262 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" podStartSLOduration=2.432048581 podStartE2EDuration="6.718241841s" podCreationTimestamp="2026-03-21 04:38:47 +0000 UTC" firstStartedPulling="2026-03-21 04:38:48.544555977 +0000 UTC m=+932.872342653" lastFinishedPulling="2026-03-21 04:38:52.830749237 +0000 UTC m=+937.158535913" observedRunningTime="2026-03-21 04:38:53.715375841 +0000 UTC m=+938.043162517" watchObservedRunningTime="2026-03-21 04:38:53.718241841 +0000 UTC m=+938.046028517" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980171 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980514 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980560 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.981147 4839 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.981198 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" gracePeriod=600 Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.735800 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" exitCode=0 Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736107 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736146 4839 scope.go:117] "RemoveContainer" containerID="da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" Mar 21 04:39:08 crc kubenswrapper[4839]: I0321 04:39:08.286096 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.053995 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.766691 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lzrf7"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.770001 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.771632 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772275 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qtgp5" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772426 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772431 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.773273 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.774645 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778420 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778447 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778531 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778552 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.782012 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.843987 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b2wb4"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.845074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.846860 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.847134 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.852895 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-d2tbg" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.853120 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.854247 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.866499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879483 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879529 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879586 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879611 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc 
kubenswrapper[4839]: I0321 04:39:28.879669 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879718 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879759 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879806 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879824 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879866 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.880281 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881309 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881704 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881788 4839 secret.go:188] 
Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881835 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs podName:822ff984-89c3-48d0-b420-4ecf223f8176 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.381820166 +0000 UTC m=+973.709606842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs") pod "frr-k8s-lzrf7" (UID: "822ff984-89c3-48d0-b420-4ecf223f8176") : secret "frr-k8s-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881915 4839 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881994 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert podName:06b3d06a-d515-469a-9a88-77b3f1e6c6f0 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.38197035 +0000 UTC m=+973.709757026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert") pod "frr-k8s-webhook-server-bcc4b6f68-qm7jb" (UID: "06b3d06a-d515-469a-9a88-77b3f1e6c6f0") : secret "frr-k8s-webhook-server-cert" not found Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.889826 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.930993 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.978369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983295 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983361 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983395 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: 
\"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983593 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.985049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc 
kubenswrapper[4839]: E0321 04:39:28.985131 4839 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985183 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485168077 +0000 UTC m=+973.812954753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "speaker-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985507 4839 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985558 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs podName:f0373e22-a3f9-48c6-abd6-fc8147ea49e6 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485536938 +0000 UTC m=+973.813323614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs") pod "controller-7bb4cc7c98-q9zb9" (UID: "f0373e22-a3f9-48c6-abd6-fc8147ea49e6") : secret "controller-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985621 4839 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985648 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485640781 +0000 UTC m=+973.813427457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "metallb-memberlist" not found Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.990902 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.993294 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.008213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " 
pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.011476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.018756 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.388543 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.388706 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.391744 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.392189 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.402364 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490393 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490708 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490771 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: E0321 04:39:29.492775 4839 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:39:29 crc kubenswrapper[4839]: E0321 04:39:29.492842 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist 
podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:30.492822469 +0000 UTC m=+974.820609145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "metallb-memberlist" not found Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.496286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.496548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.546398 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.691370 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.826044 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:29 crc kubenswrapper[4839]: W0321 04:39:29.832124 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b3d06a_d515_469a_9a88_77b3f1e6c6f0.slice/crio-f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc WatchSource:0}: Error finding container f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc: Status 404 returned error can't find the container with id f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.834277 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.955307 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:29 crc kubenswrapper[4839]: W0321 04:39:29.960052 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0373e22_a3f9_48c6_abd6_fc8147ea49e6.slice/crio-924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632 WatchSource:0}: Error finding container 924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632: Status 404 returned error can't find the container with id 924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632 Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.506018 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " 
pod="metallb-system/speaker-b2wb4" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.519108 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619718 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"b68f5220ef1f6b732d5018555e549c5121a31dbec8e183b5866f019908cd4660"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"61ec918888cba3cb29538cac06ead0d68a6acbc5f28a437c83970641a050afe1"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619928 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.621124 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"ba42772066ec959ddafaf8cd84741af0ac4d463f4e44cf3893305b9d3ba41a03"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.622845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" event={"ID":"06b3d06a-d515-469a-9a88-77b3f1e6c6f0","Type":"ContainerStarted","Data":"f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.651969 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-q9zb9" podStartSLOduration=2.651943286 podStartE2EDuration="2.651943286s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:39:30.647232235 +0000 UTC m=+974.975018931" watchObservedRunningTime="2026-03-21 04:39:30.651943286 +0000 UTC m=+974.979730002" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.709697 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.629847 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"5f20e86978e4b21f36c5428c36388b126f2db7acec1b5806b78d5311837d5785"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"68418a673127c1e259429faa893b533c6e96a97fb321206a185c6baa1e3e125f"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630230 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"54f9dccf15e053f6cd6469db3f8d9afa547b5dae24f884f45d7072078aeb511a"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630381 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.646843 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-b2wb4" podStartSLOduration=3.646819308 podStartE2EDuration="3.646819308s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:39:31.644847283 +0000 UTC m=+975.972633959" watchObservedRunningTime="2026-03-21 04:39:31.646819308 +0000 UTC m=+975.974605984" Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.675074 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="565bac187c28c85d8dfcd7bd88242ad8104b6bd3ee7a4f401ac27d881d077f3a" exitCode=0 Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.675406 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"565bac187c28c85d8dfcd7bd88242ad8104b6bd3ee7a4f401ac27d881d077f3a"} Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.677653 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" event={"ID":"06b3d06a-d515-469a-9a88-77b3f1e6c6f0","Type":"ContainerStarted","Data":"56dc8d83024726e36a1f005b734afb7ee64b3dc6b8573f3f079ae6b51be5a03c"} Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.678348 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.736577 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" podStartSLOduration=2.747086549 podStartE2EDuration="9.736495689s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" 
firstStartedPulling="2026-03-21 04:39:29.834093077 +0000 UTC m=+974.161879753" lastFinishedPulling="2026-03-21 04:39:36.823502197 +0000 UTC m=+981.151288893" observedRunningTime="2026-03-21 04:39:37.694088912 +0000 UTC m=+982.021891929" watchObservedRunningTime="2026-03-21 04:39:37.736495689 +0000 UTC m=+982.064282365" Mar 21 04:39:38 crc kubenswrapper[4839]: I0321 04:39:38.689677 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="e2c95d7b3b135c19b82da28111b6b85c7851811d4dc2ed3bbc6166eb62c44fc2" exitCode=0 Mar 21 04:39:38 crc kubenswrapper[4839]: I0321 04:39:38.689767 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"e2c95d7b3b135c19b82da28111b6b85c7851811d4dc2ed3bbc6166eb62c44fc2"} Mar 21 04:39:39 crc kubenswrapper[4839]: I0321 04:39:39.705644 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="fee246ca150bd459f2420666e9da817cd62f654c9584340fc5779d4c1a29128a" exitCode=0 Mar 21 04:39:39 crc kubenswrapper[4839]: I0321 04:39:39.706250 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"fee246ca150bd459f2420666e9da817cd62f654c9584340fc5779d4c1a29128a"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717232 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"6cbc1c4b8cb154df282fd3655e9a70ad7fadc36b8d3c20ce4d9ea29329d1a2a6"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717664 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" 
event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"d6afc1444d4a08a428f6f1ad5a0020b67e53b28d3807691f1253dce24bad5553"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"09583428a431967fdeee2fdd5a64a97d93655d0b8a6f4620d364eb2bb9293a9a"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717703 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"4340d096c43af3372f6634220f77336f714d48d74e9bbaf7919145c20384c29c"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717719 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"4461e8f0f78d715175ffaf25277a13356d4cd70a7e5240aed9ea68302cc779e0"} Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.728397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"ab37b8eb272fbf8a026d266846e789be95e7c4c569d14d26432c35b673ce942d"} Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.728590 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.764980 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lzrf7" podStartSLOduration=6.847850868 podStartE2EDuration="13.764963916s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="2026-03-21 04:39:29.891282226 +0000 UTC m=+974.219068902" lastFinishedPulling="2026-03-21 04:39:36.808395264 +0000 UTC m=+981.136181950" 
observedRunningTime="2026-03-21 04:39:41.760328967 +0000 UTC m=+986.088115733" watchObservedRunningTime="2026-03-21 04:39:41.764963916 +0000 UTC m=+986.092750592" Mar 21 04:39:44 crc kubenswrapper[4839]: I0321 04:39:44.691770 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:44 crc kubenswrapper[4839]: I0321 04:39:44.732007 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.407340 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.550727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.694735 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:50 crc kubenswrapper[4839]: I0321 04:39:50.713510 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.493520 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.494834 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.496856 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5zf2m" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.496970 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.497413 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.537313 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.552065 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.638744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.660814 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: 
\"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.824282 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:54 crc kubenswrapper[4839]: I0321 04:39:54.290086 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:54 crc kubenswrapper[4839]: I0321 04:39:54.820371 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerStarted","Data":"f9459f4578385c2c260c563d32018cecfef8f292c4211136802ae9d2d156071f"} Mar 21 04:39:55 crc kubenswrapper[4839]: I0321 04:39:55.672561 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.286564 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.287281 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.295872 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.474529 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.575705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.597870 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.612633 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:57 crc kubenswrapper[4839]: I0321 04:39:57.887122 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.850860 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerStarted","Data":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.851024 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7pfkn" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" containerID="cri-o://4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" gracePeriod=2 Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.852990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lj8h4" event={"ID":"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef","Type":"ContainerStarted","Data":"d8d5f4623ef46362fa062476beb5cd44fe699aee49dc6ca663cf74cf54f14f4b"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.853045 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lj8h4" event={"ID":"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef","Type":"ContainerStarted","Data":"9bf5dc4a328f57563a89f14d7276635e4dd98c6451e9b20265d01aba9066d661"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.880081 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7pfkn" podStartSLOduration=1.985557412 podStartE2EDuration="5.880056452s" podCreationTimestamp="2026-03-21 04:39:53 +0000 UTC" firstStartedPulling="2026-03-21 04:39:54.303742438 +0000 UTC 
m=+998.631529154" lastFinishedPulling="2026-03-21 04:39:58.198241508 +0000 UTC m=+1002.526028194" observedRunningTime="2026-03-21 04:39:58.874759834 +0000 UTC m=+1003.202546530" watchObservedRunningTime="2026-03-21 04:39:58.880056452 +0000 UTC m=+1003.207843158" Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.901129 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lj8h4" podStartSLOduration=2.849441026 podStartE2EDuration="2.901096951s" podCreationTimestamp="2026-03-21 04:39:56 +0000 UTC" firstStartedPulling="2026-03-21 04:39:58.147314243 +0000 UTC m=+1002.475100919" lastFinishedPulling="2026-03-21 04:39:58.198970168 +0000 UTC m=+1002.526756844" observedRunningTime="2026-03-21 04:39:58.892021587 +0000 UTC m=+1003.219808273" watchObservedRunningTime="2026-03-21 04:39:58.901096951 +0000 UTC m=+1003.228883667" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.345414 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.516265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.523040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm" (OuterVolumeSpecName: "kube-api-access-hwwmm") pod "e7f6aac9-7315-491e-b5b1-e0a5e43c1387" (UID: "e7f6aac9-7315-491e-b5b1-e0a5e43c1387"). InnerVolumeSpecName "kube-api-access-hwwmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.618152 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") on node \"crc\" DevicePath \"\"" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859500 4839 generic.go:334] "Generic (PLEG): container finished" podID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" exitCode=0 Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859585 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859596 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerDied","Data":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerDied","Data":"f9459f4578385c2c260c563d32018cecfef8f292c4211136802ae9d2d156071f"} Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859650 4839 scope.go:117] "RemoveContainer" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.874506 4839 scope.go:117] "RemoveContainer" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: E0321 04:39:59.874893 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": container with ID starting with 4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057 not found: ID does not exist" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.874932 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} err="failed to get container status \"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": rpc error: code = NotFound desc = could not find container \"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": container with ID starting with 4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057 not found: ID does not exist" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.887047 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.890773 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.145511 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:40:00 crc kubenswrapper[4839]: E0321 04:40:00.145936 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.145960 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.146178 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 
04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.146798 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150489 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"]
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150617 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150617 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150631 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.326616 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.428663 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.454934 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.465055 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.468850 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" path="/var/lib/kubelet/pods/e7f6aac9-7315-491e-b5b1-e0a5e43c1387/volumes"
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.856167 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"]
Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.871909 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerStarted","Data":"5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d"}
Mar 21 04:40:02 crc kubenswrapper[4839]: I0321 04:40:02.886869 4839 generic.go:334] "Generic (PLEG): container finished" podID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerID="54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df" exitCode=0
Mar 21 04:40:02 crc kubenswrapper[4839]: I0321 04:40:02.886922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerDied","Data":"54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df"}
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.230839 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.415050 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") "
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.424376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd" (OuterVolumeSpecName: "kube-api-access-t9nrd") pod "4a2cd29b-967b-4cf6-9902-6f30ad049cb1" (UID: "4a2cd29b-967b-4cf6-9902-6f30ad049cb1"). InnerVolumeSpecName "kube-api-access-t9nrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.516861 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908059 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerDied","Data":"5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d"}
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908382 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d"
Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908131 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk"
Mar 21 04:40:05 crc kubenswrapper[4839]: I0321 04:40:05.297616 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"]
Mar 21 04:40:05 crc kubenswrapper[4839]: I0321 04:40:05.302936 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"]
Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.466783 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" path="/var/lib/kubelet/pods/2dfa2356-3aca-4ed1-bfce-93cc8857825d/volumes"
Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.613117 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lj8h4"
Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.613278 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lj8h4"
Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.661758 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lj8h4"
Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.951451 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lj8h4"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.087344 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:10 crc kubenswrapper[4839]: E0321 04:40:10.089232 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.089278 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.089633 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.091563 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.108666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194178 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194707 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194768 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.296197 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.296255 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.321614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.422447 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.900476 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.962321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerStarted","Data":"ce9d7fdc03ee552772de04055fb99bd937e17bd28a54562444504527ae42320e"}
Mar 21 04:40:11 crc kubenswrapper[4839]: I0321 04:40:11.985688 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b" exitCode=0
Mar 21 04:40:11 crc kubenswrapper[4839]: I0321 04:40:11.985990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b"}
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.520176 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"]
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.521656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.524779 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tzl2s"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.532855 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"]
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637474 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637553 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.739614 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740137 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740220 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740303 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.741063 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.763055 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.839819 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.000122 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b" exitCode=0
Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.000174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b"}
Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.071360 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"]
Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.009232 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerStarted","Data":"00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db"}
Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010838 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" containerID="57be45a01d1b8d4cd697b274c6ed49d7b6536d5cdf8d1a3add452abc67e651b6" exitCode=0
Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"57be45a01d1b8d4cd697b274c6ed49d7b6536d5cdf8d1a3add452abc67e651b6"}
Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerStarted","Data":"38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c"}
Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.038154 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q2g2" podStartSLOduration=2.489975513 podStartE2EDuration="5.038138039s" podCreationTimestamp="2026-03-21 04:40:10 +0000 UTC" firstStartedPulling="2026-03-21 04:40:11.987785444 +0000 UTC m=+1016.315572120" lastFinishedPulling="2026-03-21 04:40:14.53594796 +0000 UTC m=+1018.863734646" observedRunningTime="2026-03-21 04:40:15.031865764 +0000 UTC m=+1019.359652440" watchObservedRunningTime="2026-03-21 04:40:15.038138039 +0000 UTC m=+1019.365924705"
Mar 21 04:40:16 crc kubenswrapper[4839]: I0321 04:40:16.018826 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" containerID="a5341302af03e8f55e21dd4989a0b3c126f401ec6e24ea0e494248009cb0d09c" exitCode=0
Mar 21 04:40:16 crc kubenswrapper[4839]: I0321 04:40:16.018922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"a5341302af03e8f55e21dd4989a0b3c126f401ec6e24ea0e494248009cb0d09c"}
Mar 21 04:40:17 crc kubenswrapper[4839]: I0321 04:40:17.027813 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" containerID="334a72c7ab7081387e62857481e9ea50d715e161c32f5884fbc232169d834d0a" exitCode=0
Mar 21 04:40:17 crc kubenswrapper[4839]: I0321 04:40:17.027850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"334a72c7ab7081387e62857481e9ea50d715e161c32f5884fbc232169d834d0a"}
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.405755 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.502986 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") "
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.503552 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") "
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.503627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") "
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.505796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle" (OuterVolumeSpecName: "bundle") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.512508 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth" (OuterVolumeSpecName: "kube-api-access-94nth") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "kube-api-access-94nth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.545204 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util" (OuterVolumeSpecName: "util") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605231 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605265 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605275 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c"}
Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066422 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c"
Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066441 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"
Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.423352 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.423678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.462366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.025780 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"]
Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.025995 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026005 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract"
Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.026015 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="pull"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026021 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="pull"
Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.026039 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="util"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026046 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="util"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026145 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026526 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.029374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gwtj7"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.058530 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"]
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.117667 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.138281 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.239791 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.264981 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.345124 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.646062 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"]
Mar 21 04:40:22 crc kubenswrapper[4839]: I0321 04:40:22.088373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" event={"ID":"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59","Type":"ContainerStarted","Data":"7989561f519e101d54d9d516a04aec0cbc57fa8e0c963b23185fdd9a51dbf92b"}
Mar 21 04:40:22 crc kubenswrapper[4839]: I0321 04:40:22.278445 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:24 crc kubenswrapper[4839]: I0321 04:40:24.114681 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q2g2" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server" containerID="cri-o://00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db" gracePeriod=2
Mar 21 04:40:25 crc kubenswrapper[4839]: I0321 04:40:25.122168 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db" exitCode=0
Mar 21 04:40:25 crc kubenswrapper[4839]: I0321 04:40:25.122221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db"}
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.219550 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.380548 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") "
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.380888 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") "
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381036 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") "
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381459 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities" (OuterVolumeSpecName: "utilities") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381848 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.386594 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9" (OuterVolumeSpecName: "kube-api-access-fz8c9") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "kube-api-access-fz8c9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.429375 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.483330 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.483364 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.148873 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" event={"ID":"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59","Type":"ContainerStarted","Data":"9f158af74a72fb5ea00fcf97b3330f16b9c394fcf9fc9baa4365c40374df5d7d"}
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.149256 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152519 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"ce9d7fdc03ee552772de04055fb99bd937e17bd28a54562444504527ae42320e"}
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152604 4839 scope.go:117] "RemoveContainer" containerID="00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152711 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.175441 4839 scope.go:117] "RemoveContainer" containerID="7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.200127 4839 scope.go:117] "RemoveContainer" containerID="4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.203500 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" podStartSLOduration=1.549891813 podStartE2EDuration="7.203479361s" podCreationTimestamp="2026-03-21 04:40:21 +0000 UTC" firstStartedPulling="2026-03-21 04:40:21.659428482 +0000 UTC m=+1025.987215168" lastFinishedPulling="2026-03-21 04:40:27.31301604 +0000 UTC m=+1031.640802716" observedRunningTime="2026-03-21 04:40:28.200612581 +0000 UTC m=+1032.528399257" watchObservedRunningTime="2026-03-21 04:40:28.203479361 +0000 UTC m=+1032.531266037"
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.216872 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.226748 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"]
Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.461471 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befc88a7-caca-450d-b23e-c4382b36217e" path="/var/lib/kubelet/pods/befc88a7-caca-450d-b23e-c4382b36217e/volumes"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.196427 4839 scope.go:117] "RemoveContainer" containerID="28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479015 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"]
Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479611 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server"
Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479640 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-content"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479646 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-content"
Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479658 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-utilities"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479664 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-utilities"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479768 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server"
Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.480614 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.489355 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623603 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623733 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725233 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725259 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725705 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.746501 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.798270 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:31 crc kubenswrapper[4839]: I0321 04:40:31.040187 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:31 crc kubenswrapper[4839]: I0321 04:40:31.172466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerStarted","Data":"c5f60eb50affb07ac40233821bff40c2440a94b2dbdf1b2ea286ab2eed44dd44"} Mar 21 04:40:32 crc kubenswrapper[4839]: I0321 04:40:32.182359 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" exitCode=0 Mar 21 04:40:32 crc kubenswrapper[4839]: I0321 04:40:32.182726 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e"} Mar 21 04:40:33 crc kubenswrapper[4839]: I0321 04:40:33.190861 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" exitCode=0 Mar 21 04:40:33 crc kubenswrapper[4839]: I0321 04:40:33.190902 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e"} Mar 21 04:40:34 crc kubenswrapper[4839]: I0321 04:40:34.198061 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" 
event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerStarted","Data":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} Mar 21 04:40:34 crc kubenswrapper[4839]: I0321 04:40:34.217254 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5dcmf" podStartSLOduration=2.800735849 podStartE2EDuration="4.217237946s" podCreationTimestamp="2026-03-21 04:40:30 +0000 UTC" firstStartedPulling="2026-03-21 04:40:32.184358236 +0000 UTC m=+1036.512144932" lastFinishedPulling="2026-03-21 04:40:33.600860353 +0000 UTC m=+1037.928647029" observedRunningTime="2026-03-21 04:40:34.21237563 +0000 UTC m=+1038.540162306" watchObservedRunningTime="2026-03-21 04:40:34.217237946 +0000 UTC m=+1038.545024622" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.676437 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.678185 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.686919 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806132 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806186 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907634 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907797 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.908333 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.908393 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.926300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.996201 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:37 crc kubenswrapper[4839]: I0321 04:40:37.207500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:37 crc kubenswrapper[4839]: W0321 04:40:37.213809 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedae614d_050e_4b5b_afed_f694797a2d8b.slice/crio-a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84 WatchSource:0}: Error finding container a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84: Status 404 returned error can't find the container with id a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84 Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223588 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" exitCode=0 Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c"} Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223910 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerStarted","Data":"a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84"} Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.240037 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" exitCode=0 Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 
04:40:40.240192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6"} Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.799008 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.799366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.839265 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.248946 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerStarted","Data":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.275232 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qjmfj" podStartSLOduration=2.574380659 podStartE2EDuration="5.275201535s" podCreationTimestamp="2026-03-21 04:40:36 +0000 UTC" firstStartedPulling="2026-03-21 04:40:38.225073817 +0000 UTC m=+1042.552860493" lastFinishedPulling="2026-03-21 04:40:40.925894693 +0000 UTC m=+1045.253681369" observedRunningTime="2026-03-21 04:40:41.267675845 +0000 UTC m=+1045.595462521" watchObservedRunningTime="2026-03-21 04:40:41.275201535 +0000 UTC m=+1045.602988251" Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.289820 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 
04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.347964 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:42 crc kubenswrapper[4839]: I0321 04:40:42.671888 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.266002 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5dcmf" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" containerID="cri-o://2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" gracePeriod=2 Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.624418 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807674 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: 
\"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.808500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities" (OuterVolumeSpecName: "utilities") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.820904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8" (OuterVolumeSpecName: "kube-api-access-pblj8") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "kube-api-access-pblj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.861257 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909660 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909689 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909700 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274490 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" exitCode=0 Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274611 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274606 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"c5f60eb50affb07ac40233821bff40c2440a94b2dbdf1b2ea286ab2eed44dd44"} Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274787 4839 scope.go:117] "RemoveContainer" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.304472 4839 scope.go:117] "RemoveContainer" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.314150 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.324991 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.326902 4839 scope.go:117] "RemoveContainer" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.358867 4839 scope.go:117] "RemoveContainer" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362005 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": container with ID starting with 2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc not found: ID does not exist" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362043 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} err="failed to get container status \"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": rpc error: code = NotFound desc = could not find container \"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": container with ID starting with 2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362069 4839 scope.go:117] "RemoveContainer" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362486 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": container with ID starting with 09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e not found: ID does not exist" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362515 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e"} err="failed to get container status \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": rpc error: code = NotFound desc = could not find container \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": container with ID 
starting with 09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362531 4839 scope.go:117] "RemoveContainer" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362863 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": container with ID starting with 76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e not found: ID does not exist" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362884 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e"} err="failed to get container status \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": rpc error: code = NotFound desc = could not find container \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": container with ID starting with 76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.464389 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" path="/var/lib/kubelet/pods/f4edab0e-9c96-42e0-a1e4-20a69c5493d9/volumes" Mar 21 04:40:46 crc kubenswrapper[4839]: I0321 04:40:46.997117 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:46 crc kubenswrapper[4839]: I0321 04:40:46.998144 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:47 crc 
kubenswrapper[4839]: I0321 04:40:47.087802 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:47 crc kubenswrapper[4839]: I0321 04:40:47.331935 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:48 crc kubenswrapper[4839]: I0321 04:40:48.272310 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.300789 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qjmfj" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" containerID="cri-o://8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" gracePeriod=2 Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.651269 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777502 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777551 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.778315 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities" (OuterVolumeSpecName: "utilities") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.790404 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh" (OuterVolumeSpecName: "kube-api-access-98rlh") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "kube-api-access-98rlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.809988 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879370 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879399 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879408 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.307965 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" exitCode=0 Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308053 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84"} Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308111 4839 scope.go:117] "RemoveContainer" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.330027 4839 scope.go:117] "RemoveContainer" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.339021 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.343392 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.349628 4839 scope.go:117] "RemoveContainer" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.368304 4839 scope.go:117] "RemoveContainer" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 04:40:50.369225 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": container with ID starting with 8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463 not found: ID does not exist" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369301 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} err="failed to get container status \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": rpc error: code = NotFound desc = could not find container \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": container with ID starting with 8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463 not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369347 4839 scope.go:117] "RemoveContainer" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 04:40:50.369892 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": container with ID starting with 63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6 not found: ID does not exist" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369945 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6"} err="failed to get container status \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": rpc error: code = NotFound desc = could not find container \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": container with ID starting with 63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6 not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369963 4839 scope.go:117] "RemoveContainer" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 
04:40:50.370326 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": container with ID starting with 22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c not found: ID does not exist" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.370358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c"} err="failed to get container status \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": rpc error: code = NotFound desc = could not find container \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": container with ID starting with 22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.460983 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" path="/var/lib/kubelet/pods/edae614d-050e-4b5b-afed-f694797a2d8b/volumes" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.454557 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455221 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455234 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455246 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455252 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455258 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455265 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455276 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455282 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455293 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455299 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455311 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455317 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455425 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455443 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455844 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.458050 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ww6sd" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.459072 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.459973 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.461494 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dw2zz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.472346 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.480209 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.481165 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.482581 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k62gw" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.487689 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.508720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.523227 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524355 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.525100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.529475 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.533478 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mtcqf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 
04:40:59.535762 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.536919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.543939 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-r6l7p" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.559639 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.581682 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.582534 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.598889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t2d4l" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628925 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") 
pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629047 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.668882 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.669041 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.670432 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.674164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.674729 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6p58z" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.683738 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.684964 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.693748 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m29j8" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.693933 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.694538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.704554 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.707589 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.712830 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.713879 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.718920 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.725979 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-m5jtl" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.726695 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.727588 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732512 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732548 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732655 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") 
pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.733527 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kw7js" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.746638 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.773728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.774621 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.774933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.780226 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.784929 4839 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.792787 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.793703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.794193 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.794911 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.795552 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.800021 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.810137 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-w7mps" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.810543 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-65hn8" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.815876 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.820268 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.821488 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.826656 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c7dml" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.838971 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839041 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839119 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod 
\"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839164 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839217 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839279 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.840153 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.840218 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:00.340195655 +0000 UTC m=+1064.667982331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.847477 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.860654 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.862524 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.863266 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.870159 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.884352 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.888509 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w2xjn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.902454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.915764 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.938323 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.941424 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.941528 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.957728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod \"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.957888 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 
04:40:59.957929 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.967627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.985185 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.999552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.001596 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod \"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.002653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") 
pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.003864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.012246 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.024603 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.026408 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.031634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.031677 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-m6g24" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.041672 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.042768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.044754 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.058098 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-js6r2" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066641 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqxk\" (UniqueName: 
\"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.113155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.122291 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.150348 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.165840 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.166689 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.168170 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169592 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169625 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqxk\" (UniqueName: \"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.176032 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9vx9" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.176655 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.176712 
4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:00.676695439 +0000 UTC m=+1065.004482115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.188495 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.189475 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.197689 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.203866 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-l5ks9" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.223745 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqxk\" (UniqueName: \"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.224349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.240775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.267835 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.296067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.296104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.325621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.346032 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.360079 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.367161 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.370434 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.372714 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-59xc6" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397127 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.397294 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.397342 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.397327631 +0000 UTC m=+1065.725114307 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.405559 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.416447 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.416920 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.417796 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.423879 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-cgxcz" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.425319 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.431430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.434924 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.437502 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.439654 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-v8qj9" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.440936 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.443758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499335 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499412 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499442 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.518130 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519106 4839 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519130 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519859 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519949 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.520318 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523400 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vcnt8" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523530 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wrpz4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523669 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523865 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.524517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.545682 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601964 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602002 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602026 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.622304 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.623722 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.627650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.634871 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " 
pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704712 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704735 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705618 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705685 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:41:01.205669377 +0000 UTC m=+1065.533456053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705890 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705912 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705920 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.205912324 +0000 UTC m=+1065.533698990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705939 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.705931835 +0000 UTC m=+1066.033718511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.738381 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.746042 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.791979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.803523 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.808155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.820120 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.824700 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.869765 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.970186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.002756 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.011744 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.039551 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3dc722f_f66c_46a0_9b1a_ae1b9c4de060.slice/crio-25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc WatchSource:0}: Error finding container 25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc: Status 404 returned error can't find the container with id 25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.049039 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.174821 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7bf7a3_acea_4059_8a89_db576f3588d1.slice/crio-95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645 WatchSource:0}: Error finding container 95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645: Status 404 returned error can't find the container with id 95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645 Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.177502 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.190242 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.225376 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.226012 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.225916 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226225 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:02.22621022 +0000 UTC m=+1066.553996886 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226178 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226462 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:02.226431696 +0000 UTC m=+1066.554218372 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.317603 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.381072 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.381196 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361c2d7b_9a75_41fd_953d_4b1bd64ca6df.slice/crio-484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465 WatchSource:0}: Error finding container 484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465: Status 404 returned error can't find the container with id 484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465 Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.401584 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.425470 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.430554 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.430874 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.430931 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:03.430913906 +0000 UTC m=+1067.758700582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.466813 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xfbjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-sp4j4_openstack-operators(2162bafb-7e49-435c-9591-d8b725f10336): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.468416 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:01 crc 
kubenswrapper[4839]: I0321 04:41:01.474867 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.481443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" event={"ID":"0c51ffa0-2285-4f7e-af09-0cafba139934","Type":"ContainerStarted","Data":"f178a46d5304a2344a0cf7e88fe08339ae279ffac5a78564c6d46cc77ef8989c"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.483144 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.549866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" event={"ID":"fd731e7e-440b-4e77-a778-08a4a62e0c9f","Type":"ContainerStarted","Data":"187fc773e622271ea9dd4774a4a961f54233a950e74d2da6ba9f5a9b080c16eb"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.580870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" event={"ID":"361c2d7b-9a75-41fd-953d-4b1bd64ca6df","Type":"ContainerStarted","Data":"484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.595029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" event={"ID":"ee9d64a7-0d03-4cb0-a266-47b26f9957b5","Type":"ContainerStarted","Data":"b57c63613e5e99cde07bf4587e39393a9651f73f2d70628187580237ce767ec1"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.613275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:01 crc 
kubenswrapper[4839]: I0321 04:41:01.615741 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" event={"ID":"7a7bf7a3-acea-4059-8a89-db576f3588d1","Type":"ContainerStarted","Data":"95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.623262 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" event={"ID":"acb1d7ac-b3f9-4564-8346-344ffb5c3964","Type":"ContainerStarted","Data":"d11080250b691496b64dace2092d9ed99da1740c09542dd8a197140613a4549b"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.648662 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.653667 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d32b541_7b80_492b_adac_e51d5090b668.slice/crio-5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d WatchSource:0}: Error finding container 5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d: Status 404 returned error can't find the container with id 5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.663627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.666096 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.681725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" 
event={"ID":"6074766c-0ecd-4051-a676-dcc21b24184f","Type":"ContainerStarted","Data":"8a0e49009225321e3e5163ca7f552fd97c178ddb4ba50dc2360e71b295cda662"} Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.681881 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgnfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lzbtt_openstack-operators(c8584ecb-dc92-4cec-9178-3017f09095da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.683137 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.684103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" event={"ID":"6914418f-3639-4ebc-a58d-d8b478cbf6b4","Type":"ContainerStarted","Data":"84d5fe1899628facaa4eb033948b29c1621a64d3687adbb83a4011a9d37b203e"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.689481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" 
event={"ID":"05f30a88-e899-4727-9440-981d010a1342","Type":"ContainerStarted","Data":"0ab49986c77037fd9f16877166b7ad114b8319308b9c26305ab3284dd8b48804"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.691511 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" event={"ID":"ccec0d11-294b-43a2-be2e-fcef8a6818c6","Type":"ContainerStarted","Data":"09b1189a2dfcb9fb6e01a5648e76850fe32bb5469efc192ae2a1427725dda062"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.693313 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.694181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" event={"ID":"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060","Type":"ContainerStarted","Data":"25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.695538 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" event={"ID":"70702cd5-6815-4a01-98a4-2f4dfaeef839","Type":"ContainerStarted","Data":"476911e8d76781fef68c6933ac928afce635c2f8a44b15d059a55811fc657a08"} Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.699076 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eeb53bd_3988_458f_baa5_d265e0178aea.slice/crio-6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4 WatchSource:0}: Error finding container 6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4: Status 404 returned error can't find the container with id 6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4 Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.702501 
4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379b40a1_e3f5_448b_b668_0f168457e5d0.slice/crio-720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da WatchSource:0}: Error finding container 720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da: Status 404 returned error can't find the container with id 720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.704803 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdqxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-qt58c_openstack-operators(379b40a1-e3f5-448b-b668-0f168457e5d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.704797 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2hwvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-7f4qh_openstack-operators(5eeb53bd-3988-458f-baa5-d265e0178aea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.706500 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.706526 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.709891 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ea9c2e_11a4_492e_9e84_8294e81ce775.slice/crio-762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8 WatchSource:0}: Error finding container 
762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8: Status 404 returned error can't find the container with id 762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8 Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.716414 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bx8b8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-btkvt_openstack-operators(d3ea9c2e-11a4-492e-9e84-8294e81ce775): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.717622 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.751078 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.751258 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.751332 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:03.75131457 +0000 UTC m=+1068.079101246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.256657 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.257076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.257541 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.257603 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 
nodeName:}" failed. No retries permitted until 2026-03-21 04:41:04.257590172 +0000 UTC m=+1068.585376848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.258277 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.259299 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:04.258356753 +0000 UTC m=+1068.586143429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.716373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" event={"ID":"faac458b-73d9-4fb8-9f1c-50f7521088b0","Type":"ContainerStarted","Data":"1e35a256903f495719370ffe2429bd5fe5d7996f75d2d2e1db93599fc471e751"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.720389 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" event={"ID":"379b40a1-e3f5-448b-b668-0f168457e5d0","Type":"ContainerStarted","Data":"720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da"} Mar 21 04:41:02 crc 
kubenswrapper[4839]: E0321 04:41:02.722922 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.728221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" event={"ID":"c8584ecb-dc92-4cec-9178-3017f09095da","Type":"ContainerStarted","Data":"e8410073c966fb866a357a85931b73911a01e1b42c609718749511a611851fdd"} Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.732876 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.741464 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" event={"ID":"2045f5d2-c67e-47cd-b16d-3c69d449f099","Type":"ContainerStarted","Data":"ebc9a84925147293fca3aa5eef619abe213ed86caccf4379f911aad3558c281a"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.743652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" event={"ID":"5eeb53bd-3988-458f-baa5-d265e0178aea","Type":"ContainerStarted","Data":"6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4"} Mar 21 04:41:02 crc 
kubenswrapper[4839]: E0321 04:41:02.746579 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.763368 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" event={"ID":"d3ea9c2e-11a4-492e-9e84-8294e81ce775","Type":"ContainerStarted","Data":"762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.767907 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" event={"ID":"2162bafb-7e49-435c-9591-d8b725f10336","Type":"ContainerStarted","Data":"71e02a0a009b78d60b7181c44029ff9136a64c0fca88cf247d4175cc39a0e6c3"} Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.773124 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.773193 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.773493 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" event={"ID":"1d32b541-7b80-492b-adac-e51d5090b668","Type":"ContainerStarted","Data":"5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d"} Mar 21 04:41:03 crc kubenswrapper[4839]: I0321 04:41:03.474241 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.474617 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.474675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:07.47466086 +0000 UTC m=+1071.802447536 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: I0321 04:41:03.779602 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779880 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779925 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:07.77991076 +0000 UTC m=+1072.107697436 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779915 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780246 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780705 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780797 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.781546 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:04 crc kubenswrapper[4839]: I0321 04:41:04.288478 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:04 crc kubenswrapper[4839]: I0321 04:41:04.288681 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288853 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288900 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:04 
crc kubenswrapper[4839]: E0321 04:41:04.288933 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:08.288913359 +0000 UTC m=+1072.616700035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288972 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:08.28895013 +0000 UTC m=+1072.616736806 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: I0321 04:41:07.538381 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.538611 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.539152 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:15.539121365 +0000 UTC m=+1079.866908041 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: I0321 04:41:07.842667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.842844 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.842889 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:15.842876053 +0000 UTC m=+1080.170662729 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: I0321 04:41:08.349897 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:08 crc kubenswrapper[4839]: I0321 04:41:08.349973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350100 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350177 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:16.350157684 +0000 UTC m=+1080.677944440 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350225 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350362 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:16.350309688 +0000 UTC m=+1080.678096454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.612665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.653147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc 
kubenswrapper[4839]: I0321 04:41:15.662915 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.919541 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.924242 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.015925 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.426531 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.427396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.430725 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.431526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.498111 4839 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vcnt8" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.509803 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.925597 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.926277 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bzgnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-hh27s_openstack-operators(1d32b541-7b80-492b-adac-e51d5090b668): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.927491 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podUID="1d32b541-7b80-492b-adac-e51d5090b668" Mar 21 04:41:19 crc kubenswrapper[4839]: E0321 04:41:19.509038 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podUID="1d32b541-7b80-492b-adac-e51d5090b668" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.121523 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.121715 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tcpnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-94vpf_openstack-operators(70702cd5-6815-4a01-98a4-2f4dfaeef839): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.123432 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podUID="70702cd5-6815-4a01-98a4-2f4dfaeef839" Mar 21 04:41:21 crc kubenswrapper[4839]: E0321 04:41:21.114161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podUID="70702cd5-6815-4a01-98a4-2f4dfaeef839" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.289819 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.290017 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6648r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-wjw9j_openstack-operators(6914418f-3639-4ebc-a58d-d8b478cbf6b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.291184 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podUID="6914418f-3639-4ebc-a58d-d8b478cbf6b4" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.804039 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.804239 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5fztr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-dncxc_openstack-operators(05f30a88-e899-4727-9440-981d010a1342): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.805464 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podUID="05f30a88-e899-4727-9440-981d010a1342" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.128343 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podUID="05f30a88-e899-4727-9440-981d010a1342" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.131513 4839 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podUID="6914418f-3639-4ebc-a58d-d8b478cbf6b4" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.415887 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.416086 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdzgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-k4lg5_openstack-operators(7a7bf7a3-acea-4059-8a89-db576f3588d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.417263 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podUID="7a7bf7a3-acea-4059-8a89-db576f3588d1" Mar 21 04:41:24 crc kubenswrapper[4839]: E0321 04:41:24.136637 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podUID="7a7bf7a3-acea-4059-8a89-db576f3588d1" Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.547638 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.626505 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:41:27 crc kubenswrapper[4839]: W0321 04:41:27.652474 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2a8cd0_1c71_45bb_b4fc_4c7f82515b3b.slice/crio-b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5 WatchSource:0}: Error finding container b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5: Status 404 returned error can't find the container with id b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5 Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.781549 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.188815 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" event={"ID":"6074766c-0ecd-4051-a676-dcc21b24184f","Type":"ContainerStarted","Data":"9a9f78cf0a14d5de0729d09dbdb5cf3e9768422484d405dc6ebae8447edfda36"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.189805 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:28 crc 
kubenswrapper[4839]: I0321 04:41:28.212888 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" event={"ID":"2162bafb-7e49-435c-9591-d8b725f10336","Type":"ContainerStarted","Data":"a103cedf709e0f1efd5371c0501a46a1069b5e2c9d2f96700d6110af18b60471"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.213730 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.220041 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" event={"ID":"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b","Type":"ContainerStarted","Data":"b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.237817 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" event={"ID":"ccec0d11-294b-43a2-be2e-fcef8a6818c6","Type":"ContainerStarted","Data":"41473f2c5d8c86b32cdc3c2a6d5e0e216ac9b1861edefe0e092ba5e6634ccefa"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.238602 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.261941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" event={"ID":"2045f5d2-c67e-47cd-b16d-3c69d449f099","Type":"ContainerStarted","Data":"67acf10ed6bac7057aee76136d969a692d945cbe0d46b587a7c4f9f547fe11f6"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.262772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" 
Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.277798 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" podStartSLOduration=7.300646789 podStartE2EDuration="29.277778984s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.418690834 +0000 UTC m=+1065.746477500" lastFinishedPulling="2026-03-21 04:41:23.395823019 +0000 UTC m=+1087.723609695" observedRunningTime="2026-03-21 04:41:28.227745454 +0000 UTC m=+1092.555532130" watchObservedRunningTime="2026-03-21 04:41:28.277778984 +0000 UTC m=+1092.605565660" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.290693 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podStartSLOduration=3.606030701 podStartE2EDuration="29.290670495s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.466641756 +0000 UTC m=+1065.794428432" lastFinishedPulling="2026-03-21 04:41:27.15128156 +0000 UTC m=+1091.479068226" observedRunningTime="2026-03-21 04:41:28.278822723 +0000 UTC m=+1092.606609399" watchObservedRunningTime="2026-03-21 04:41:28.290670495 +0000 UTC m=+1092.618457171" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.292875 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" event={"ID":"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060","Type":"ContainerStarted","Data":"634de7da255cd804a2e18816b291b64eb7d000a599f377038f3ddc0140788ad6"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.293666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.305621 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" event={"ID":"06f9e67e-8978-46a1-9dc8-c511197241e2","Type":"ContainerStarted","Data":"55bdf6e62bbc1a37ec4b18a5aba81bffda0650b9566b70d88a865acf43d8d066"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.305676 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" event={"ID":"06f9e67e-8978-46a1-9dc8-c511197241e2","Type":"ContainerStarted","Data":"4a7a84c40d7546d7e282c94ae650b82aa400a266a6a30a4f3c9a726d568986c2"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.306397 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.332847 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" podStartSLOduration=6.041056262 podStartE2EDuration="29.332827954s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.19903764 +0000 UTC m=+1065.526824316" lastFinishedPulling="2026-03-21 04:41:24.490809312 +0000 UTC m=+1088.818596008" observedRunningTime="2026-03-21 04:41:28.329949914 +0000 UTC m=+1092.657736590" watchObservedRunningTime="2026-03-21 04:41:28.332827954 +0000 UTC m=+1092.660614630" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.344897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" event={"ID":"d3ea9c2e-11a4-492e-9e84-8294e81ce775","Type":"ContainerStarted","Data":"c35cc8d062eac13eebcfe68346660c95e96b656d2e63c934431da28d13763fc9"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.345621 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.368730 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" event={"ID":"faac458b-73d9-4fb8-9f1c-50f7521088b0","Type":"ContainerStarted","Data":"711fedfa565003f3e8676b238fb5732b71d1179ad5a4f3ce139b01d353c78c5b"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.369373 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.377149 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" event={"ID":"c8584ecb-dc92-4cec-9178-3017f09095da","Type":"ContainerStarted","Data":"5225ba85403837e007c27f0908ad15aaa3a398465f58d263aa47458eb177b30a"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.382461 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" podStartSLOduration=7.45296122 podStartE2EDuration="29.382438982s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.466233574 +0000 UTC m=+1065.794020250" lastFinishedPulling="2026-03-21 04:41:23.395711336 +0000 UTC m=+1087.723498012" observedRunningTime="2026-03-21 04:41:28.381905527 +0000 UTC m=+1092.709692193" watchObservedRunningTime="2026-03-21 04:41:28.382438982 +0000 UTC m=+1092.710225668" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.396787 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" event={"ID":"5eeb53bd-3988-458f-baa5-d265e0178aea","Type":"ContainerStarted","Data":"3243ecf3495ef3b03890baf0dd8f20305c3138a5be4bd45c487e00a1ea0e92ef"} Mar 21 
04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.397448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.410631 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" event={"ID":"0c51ffa0-2285-4f7e-af09-0cafba139934","Type":"ContainerStarted","Data":"2c88204fa488213d4a01f36f6ead51e69c30b08d8733182a70d861da1c63aa67"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.411392 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.418897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" event={"ID":"fd731e7e-440b-4e77-a778-08a4a62e0c9f","Type":"ContainerStarted","Data":"f6d7298b89265ea0b0e32733b5171ef0302b5814bb0a0b1807e3695a32147290"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.419690 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.434467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" event={"ID":"361c2d7b-9a75-41fd-953d-4b1bd64ca6df","Type":"ContainerStarted","Data":"399588d5878e8f33a9fe43bf1f68cd960748e5bc08fc984a42824e5d44050afe"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.435126 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475065 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" event={"ID":"ee9d64a7-0d03-4cb0-a266-47b26f9957b5","Type":"ContainerStarted","Data":"020e99262e5422112c1e7f4291e1e1bb460871d7f1abd0108bc960753b39b0bb"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" event={"ID":"859b11bc-e9fb-40a2-a053-66a07337965c","Type":"ContainerStarted","Data":"e0600411406dd7042d97e44d13c8fe878d34e3cf93a42f82842f96fe3401748c"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475122 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.479864 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" event={"ID":"acb1d7ac-b3f9-4564-8346-344ffb5c3964","Type":"ContainerStarted","Data":"fa4799519581ae9bbfb1067af15231c49e45fbcc764ac42ff1560afc251a03c0"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.480264 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.488890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" event={"ID":"379b40a1-e3f5-448b-b668-0f168457e5d0","Type":"ContainerStarted","Data":"0efd5f828c3c703c7d4cfecb83afa6f34b7fb100ae7c48ed515b9e78bae52115"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.489706 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.495916 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" podStartSLOduration=7.556168027 podStartE2EDuration="29.495894196s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.463191089 +0000 UTC m=+1065.790977765" lastFinishedPulling="2026-03-21 04:41:23.402917258 +0000 UTC m=+1087.730703934" observedRunningTime="2026-03-21 04:41:28.495753552 +0000 UTC m=+1092.823540228" watchObservedRunningTime="2026-03-21 04:41:28.495894196 +0000 UTC m=+1092.823680892" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.496998 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" podStartSLOduration=6.048641534 podStartE2EDuration="29.496987807s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.042700696 +0000 UTC m=+1065.370487372" lastFinishedPulling="2026-03-21 04:41:24.491046949 +0000 UTC m=+1088.818833645" observedRunningTime="2026-03-21 04:41:28.448805659 +0000 UTC m=+1092.776592335" watchObservedRunningTime="2026-03-21 04:41:28.496987807 +0000 UTC m=+1092.824774493" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.528057 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podStartSLOduration=3.9756173390000002 podStartE2EDuration="29.528038025s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.716281049 +0000 UTC m=+1066.044067735" lastFinishedPulling="2026-03-21 04:41:27.268701745 +0000 UTC m=+1091.596488421" observedRunningTime="2026-03-21 04:41:28.526117962 +0000 UTC m=+1092.853904638" watchObservedRunningTime="2026-03-21 04:41:28.528038025 +0000 UTC m=+1092.855824701" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.614304 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" podStartSLOduration=28.614285558 podStartE2EDuration="28.614285558s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:41:28.61292741 +0000 UTC m=+1092.940714096" watchObservedRunningTime="2026-03-21 04:41:28.614285558 +0000 UTC m=+1092.942072234" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.617077 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podStartSLOduration=2.9663166199999997 podStartE2EDuration="28.617059586s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.681736903 +0000 UTC m=+1066.009523579" lastFinishedPulling="2026-03-21 04:41:27.332479869 +0000 UTC m=+1091.660266545" observedRunningTime="2026-03-21 04:41:28.557905901 +0000 UTC m=+1092.885692577" watchObservedRunningTime="2026-03-21 04:41:28.617059586 +0000 UTC m=+1092.944846262" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.655720 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" podStartSLOduration=6.001018561 podStartE2EDuration="29.655697417s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.835996883 +0000 UTC m=+1065.163783559" lastFinishedPulling="2026-03-21 04:41:24.490675739 +0000 UTC m=+1088.818462415" observedRunningTime="2026-03-21 04:41:28.648405673 +0000 UTC m=+1092.976192349" watchObservedRunningTime="2026-03-21 04:41:28.655697417 +0000 UTC m=+1092.983484093" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.684133 4839 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podStartSLOduration=4.121099691 podStartE2EDuration="29.684115112s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.704634974 +0000 UTC m=+1066.032421640" lastFinishedPulling="2026-03-21 04:41:27.267650385 +0000 UTC m=+1091.595437061" observedRunningTime="2026-03-21 04:41:28.680227673 +0000 UTC m=+1093.008014359" watchObservedRunningTime="2026-03-21 04:41:28.684115112 +0000 UTC m=+1093.011901788" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.722024 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podStartSLOduration=4.264109832 podStartE2EDuration="29.722002732s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.704640824 +0000 UTC m=+1066.032427500" lastFinishedPulling="2026-03-21 04:41:27.162533724 +0000 UTC m=+1091.490320400" observedRunningTime="2026-03-21 04:41:28.713917295 +0000 UTC m=+1093.041703981" watchObservedRunningTime="2026-03-21 04:41:28.722002732 +0000 UTC m=+1093.049789408" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.749700 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" podStartSLOduration=6.6804999 podStartE2EDuration="29.749675886s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.422078609 +0000 UTC m=+1065.749865285" lastFinishedPulling="2026-03-21 04:41:24.491254585 +0000 UTC m=+1088.819041271" observedRunningTime="2026-03-21 04:41:28.748819262 +0000 UTC m=+1093.076605938" watchObservedRunningTime="2026-03-21 04:41:28.749675886 +0000 UTC m=+1093.077462572" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.776106 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" podStartSLOduration=6.373442051 podStartE2EDuration="29.776090725s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.088335993 +0000 UTC m=+1065.416122669" lastFinishedPulling="2026-03-21 04:41:24.490984647 +0000 UTC m=+1088.818771343" observedRunningTime="2026-03-21 04:41:28.774461509 +0000 UTC m=+1093.102248205" watchObservedRunningTime="2026-03-21 04:41:28.776090725 +0000 UTC m=+1093.103877401" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.808654 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" podStartSLOduration=7.38825609 podStartE2EDuration="29.808636635s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.982540823 +0000 UTC m=+1065.310327499" lastFinishedPulling="2026-03-21 04:41:23.402921368 +0000 UTC m=+1087.730708044" observedRunningTime="2026-03-21 04:41:28.80344031 +0000 UTC m=+1093.131226986" watchObservedRunningTime="2026-03-21 04:41:28.808636635 +0000 UTC m=+1093.136423311" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.832843 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" podStartSLOduration=7.272967195 podStartE2EDuration="29.832824932s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.835998223 +0000 UTC m=+1065.163784899" lastFinishedPulling="2026-03-21 04:41:23.39585596 +0000 UTC m=+1087.723642636" observedRunningTime="2026-03-21 04:41:28.8309806 +0000 UTC m=+1093.158767276" watchObservedRunningTime="2026-03-21 04:41:28.832824932 +0000 UTC m=+1093.160611608" Mar 21 04:41:30 crc kubenswrapper[4839]: I0321 04:41:30.979874 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:41:30 crc kubenswrapper[4839]: I0321 04:41:30.980258 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.527069 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" event={"ID":"1d32b541-7b80-492b-adac-e51d5090b668","Type":"ContainerStarted","Data":"d6f1629e94084e4cd740063bed823aafe4e138a7b52af9e308c678da34c08dab"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.527642 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.528990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" event={"ID":"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b","Type":"ContainerStarted","Data":"f8d051f6645b73a056ba3e02d18242fa3c2872a559ea6446dedc430614cafbd7"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.529118 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.530877 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" 
event={"ID":"859b11bc-e9fb-40a2-a053-66a07337965c","Type":"ContainerStarted","Data":"c8487d32c5de78d29c1475c53de91c166771fbb61a68ec96854bfb0de661a1b1"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.531050 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.549859 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podStartSLOduration=2.554344764 podStartE2EDuration="32.549839766s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.68163365 +0000 UTC m=+1066.009420326" lastFinishedPulling="2026-03-21 04:41:31.677128632 +0000 UTC m=+1096.004915328" observedRunningTime="2026-03-21 04:41:32.542535072 +0000 UTC m=+1096.870321768" watchObservedRunningTime="2026-03-21 04:41:32.549839766 +0000 UTC m=+1096.877626442" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.555513 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" podStartSLOduration=29.534562028 podStartE2EDuration="33.555488875s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:27.65496668 +0000 UTC m=+1091.982753356" lastFinishedPulling="2026-03-21 04:41:31.675893527 +0000 UTC m=+1096.003680203" observedRunningTime="2026-03-21 04:41:32.555463034 +0000 UTC m=+1096.883249730" watchObservedRunningTime="2026-03-21 04:41:32.555488875 +0000 UTC m=+1096.883275551" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.588445 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" podStartSLOduration=29.476132373 podStartE2EDuration="33.588428546s" 
podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:27.564829259 +0000 UTC m=+1091.892615935" lastFinishedPulling="2026-03-21 04:41:31.677125422 +0000 UTC m=+1096.004912108" observedRunningTime="2026-03-21 04:41:32.58785887 +0000 UTC m=+1096.915645556" watchObservedRunningTime="2026-03-21 04:41:32.588428546 +0000 UTC m=+1096.916215222" Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.546553 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" event={"ID":"70702cd5-6815-4a01-98a4-2f4dfaeef839","Type":"ContainerStarted","Data":"b58db27d2d95e030329747eea38da9c271f47fb2d475f66104b3ccb3429a2c3c"} Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.547350 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.562533 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podStartSLOduration=3.050101138 podStartE2EDuration="35.562517421s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.345460486 +0000 UTC m=+1065.673247162" lastFinishedPulling="2026-03-21 04:41:33.857876739 +0000 UTC m=+1098.185663445" observedRunningTime="2026-03-21 04:41:34.559422134 +0000 UTC m=+1098.887208810" watchObservedRunningTime="2026-03-21 04:41:34.562517421 +0000 UTC m=+1098.890304097" Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 04:41:35.556074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" event={"ID":"05f30a88-e899-4727-9440-981d010a1342","Type":"ContainerStarted","Data":"a1f568e57e5085c011ddf0acbcf6148bbd55263b2b00d976af4bff35fd113311"} Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 
04:41:35.556578 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 04:41:35.577319 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podStartSLOduration=2.340336302 podStartE2EDuration="36.577296579s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.644686391 +0000 UTC m=+1064.972473067" lastFinishedPulling="2026-03-21 04:41:34.881646668 +0000 UTC m=+1099.209433344" observedRunningTime="2026-03-21 04:41:35.572903616 +0000 UTC m=+1099.900690312" watchObservedRunningTime="2026-03-21 04:41:35.577296579 +0000 UTC m=+1099.905083255" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.026076 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.515126 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.575317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" event={"ID":"7a7bf7a3-acea-4059-8a89-db576f3588d1","Type":"ContainerStarted","Data":"f0607583941d87b541d87529b02738cfb39b0b3cd9d1173a5fa1a97050d8e31d"} Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.576020 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.593174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" event={"ID":"6914418f-3639-4ebc-a58d-d8b478cbf6b4","Type":"ContainerStarted","Data":"a0218d33682d53e141602fa8319c3ab211da923160ce7bbabde66b108e30e250"} Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.594083 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.607045 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podStartSLOduration=5.795175313 podStartE2EDuration="40.607027332s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.177558569 +0000 UTC m=+1065.505345245" lastFinishedPulling="2026-03-21 04:41:35.989410578 +0000 UTC m=+1100.317197264" observedRunningTime="2026-03-21 04:41:36.603710124 +0000 UTC m=+1100.931496800" watchObservedRunningTime="2026-03-21 04:41:39.607027332 +0000 UTC m=+1103.934814008" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.607426 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podStartSLOduration=3.3295372260000002 podStartE2EDuration="40.607420873s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.024783645 +0000 UTC m=+1065.352570321" lastFinishedPulling="2026-03-21 04:41:38.302667282 +0000 UTC m=+1102.630453968" observedRunningTime="2026-03-21 04:41:39.606676622 +0000 UTC m=+1103.934463298" watchObservedRunningTime="2026-03-21 04:41:39.607420873 +0000 UTC m=+1103.935207549" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.777475 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:41:39 crc 
kubenswrapper[4839]: I0321 04:41:39.824756 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.866124 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.872115 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.940098 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.048100 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.201366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.228130 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.270520 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.368727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.444406 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.528957 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.549102 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.795042 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.824147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.827810 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:45 crc kubenswrapper[4839]: I0321 04:41:45.668628 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:49 crc kubenswrapper[4839]: I0321 04:41:49.787010 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:41:50 crc kubenswrapper[4839]: I0321 04:41:50.007276 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:50 crc kubenswrapper[4839]: I0321 04:41:50.178478 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.161612 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.163034 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.166809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.167184 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.167742 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.171122 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.224696 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.326080 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " 
pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.343825 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.482445 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.879780 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.980373 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.980438 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:42:01 crc kubenswrapper[4839]: I0321 04:42:01.744917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerStarted","Data":"cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"} Mar 21 04:42:04 crc kubenswrapper[4839]: I0321 04:42:04.769842 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerStarted","Data":"13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd"} Mar 21 04:42:04 crc kubenswrapper[4839]: I0321 04:42:04.782888 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567802-zsmks" podStartSLOduration=1.362470622 podStartE2EDuration="4.782871449s" podCreationTimestamp="2026-03-21 04:42:00 +0000 UTC" firstStartedPulling="2026-03-21 04:42:00.893745789 +0000 UTC m=+1125.221532465" lastFinishedPulling="2026-03-21 04:42:04.314146616 +0000 UTC m=+1128.641933292" observedRunningTime="2026-03-21 04:42:04.78074747 +0000 UTC m=+1129.108534156" watchObservedRunningTime="2026-03-21 04:42:04.782871449 +0000 UTC m=+1129.110658125" Mar 21 04:42:05 crc kubenswrapper[4839]: I0321 04:42:05.777980 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerID="13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd" exitCode=0 Mar 21 04:42:05 crc kubenswrapper[4839]: I0321 04:42:05.778030 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerDied","Data":"13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd"} Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.057322 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.126745 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"ab3902e0-a483-447f-b86c-4fe8e8983152\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.134854 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb" (OuterVolumeSpecName: "kube-api-access-ktvvb") pod "ab3902e0-a483-447f-b86c-4fe8e8983152" (UID: "ab3902e0-a483-447f-b86c-4fe8e8983152"). InnerVolumeSpecName "kube-api-access-ktvvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.227961 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.413743 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:07 crc kubenswrapper[4839]: E0321 04:42:07.414427 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.414449 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.414688 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.415558 4839 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417257 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8nc52" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417435 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417546 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.419240 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.434858 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.435196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.446224 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.487703 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.489142 4839 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.492041 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.501802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.538321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.538422 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.539656 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.578623 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639420 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639765 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.740987 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741038 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741098 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741823 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.742323 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.752996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.758360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791366 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerDied","Data":"cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"}
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791632 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791755 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.813990 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.855763 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.868438 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.986928 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:07 crc kubenswrapper[4839]: W0321 04:42:07.990761 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef534e12_a75b_4e9b_b2a3_4046bece5903.slice/crio-8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c WatchSource:0}: Error finding container 8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c: Status 404 returned error can't find the container with id 8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.256986 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.462536 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" path="/var/lib/kubelet/pods/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b/volumes"
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.800551 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" event={"ID":"2f135620-f512-4bfe-9875-4d4e07c6a0f5","Type":"ContainerStarted","Data":"2be9891bd895defb2ca0d28836ebf421c8763119934d2eefdf49a40b80778096"}
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.802459 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" event={"ID":"ef534e12-a75b-4e9b-b2a3-4046bece5903","Type":"ContainerStarted","Data":"8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c"}
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.319885 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.348979 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.354806 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.373063 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.484951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.485224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.485277 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586290 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586349 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586393 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.589070 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.589620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.611014 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.623959 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.667457 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.668672 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.673508 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.684913 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791763 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791873 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791946 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894918 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.895981 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.896029 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.914473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.985995 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.521128 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.523104 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.525115 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.525972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.526247 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.526534 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527259 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527477 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wq8rw"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527658 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709491 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709590 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709755 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709791 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709995 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.799938 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.801304 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.806367 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.806545 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807544 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nxhtb"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807726 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807966 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.808247 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.808498 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810903 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810984 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811034 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811072 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811132 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812150 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812405 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812468 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812821 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816018 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816192 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816363 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.818195 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.827633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.835276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.836425 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.843497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.844026 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.847900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.912986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913029 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913073 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913091 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913256 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913332 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913438 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015016 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015071 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015097 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015208 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015238 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015261 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015308
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015368 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015738 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019754 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019905 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.020865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.023937 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.024261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.031300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc 
kubenswrapper[4839]: I0321 04:42:12.032272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.037791 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.063279 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.073650 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.197671 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.143858 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.146482 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152706 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152783 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p2lmh" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152890 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152970 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.153010 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.157683 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236273 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236467 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236500 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qv6bv\" (UniqueName: \"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236539 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236682 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236708 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236880 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338041 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338096 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338112 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6bv\" (UniqueName: \"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338156 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338198 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338215 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338236 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338588 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338718 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 
04:42:13.339448 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.339591 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.340061 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.346212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.347665 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.366556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6bv\" (UniqueName: 
\"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.368875 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.466367 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.418676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.419777 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.424888 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-shglw" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.430148 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.431991 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.432275 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.449832 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.558525 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.559813 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.559877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560078 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560142 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560177 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.562962 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664972 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664998 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665046 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665147 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665539 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.666616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.666646 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.667037 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.670015 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.683755 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.687315 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.688620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.740463 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.757893 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.758949 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.763363 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.763392 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4xnqn" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.767430 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.776867 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870514 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870561 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870629 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870725 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972224 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972309 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" 
Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972411 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.973307 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.973341 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.975916 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.987826 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.998859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:15 crc kubenswrapper[4839]: I0321 04:42:15.074314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.152323 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.153724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.155824 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cck8h" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.175950 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.219090 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.325649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: 
\"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.348816 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.473084 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.425252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.426499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.429029 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.430439 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-62scq" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.431537 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.437420 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478740 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478880 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478907 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478928 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478969 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.479030 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.485233 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hrww8"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.486753 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.513115 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hrww8"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580007 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580058 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580087 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580163 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580215 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580233 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580248 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod 
\"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580277 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580294 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580311 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.581264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.581345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 
crc kubenswrapper[4839]: I0321 04:42:20.581453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.582685 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.593692 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.594538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.601711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682253 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682370 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682493 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: 
\"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683733 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683815 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683923 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.685472 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.701006 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.758048 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.805808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.930113 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.930584 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkmxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-zn278_openstack(ef534e12-a75b-4e9b-b2a3-4046bece5903): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.935630 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" podUID="ef534e12-a75b-4e9b-b2a3-4046bece5903" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.953694 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.954006 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grztj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7tj8n_openstack(2f135620-f512-4bfe-9875-4d4e07c6a0f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.955858 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.363519 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524083 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"ef534e12-a75b-4e9b-b2a3-4046bece5903\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") "
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524330 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"ef534e12-a75b-4e9b-b2a3-4046bece5903\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") "
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config" (OuterVolumeSpecName: "config") pod "ef534e12-a75b-4e9b-b2a3-4046bece5903" (UID: "ef534e12-a75b-4e9b-b2a3-4046bece5903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.546088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw" (OuterVolumeSpecName: "kube-api-access-vkmxw") pod "ef534e12-a75b-4e9b-b2a3-4046bece5903" (UID: "ef534e12-a75b-4e9b-b2a3-4046bece5903"). InnerVolumeSpecName "kube-api-access-vkmxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.626673 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.626708 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.649868 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.658479 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.676376 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.688748 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.698177 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.704442 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: W0321 04:42:22.720446 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1d0e8c_00aa_4770_9e58_b8f706d80a35.slice/crio-13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541 WatchSource:0}: Error finding container 13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541: Status 404 returned error can't find the container with id 13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541
Mar 21 04:42:22 crc kubenswrapper[4839]: W0321 04:42:22.730852 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8028561c_b039_4400_a065_b5efee753b5f.slice/crio-5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54 WatchSource:0}: Error finding container 5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54: Status 404 returned error can't find the container with id 5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.865720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.964600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"9b8ab073370c7e9b4ef0d33f05bad1959de23b13a08aa77b184763533f460bc8"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.966321 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.966363 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" event={"ID":"ef534e12-a75b-4e9b-b2a3-4046bece5903","Type":"ContainerDied","Data":"8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.977799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"0ce222ce6f31634231e7d8b6628241b56c0a1c3b1c7afeb55d6d347fe6bd4c49"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.979295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c","Type":"ContainerStarted","Data":"73cd9e67027e3c3a25c526ec7264951bd83d32e50e75829a33891f7f554d78d2"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.987116 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4"]
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.988328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"92d8cbd04f02626e8199703ada8650eafd375a40b634369189fe2eae5e79b310"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.991243 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.993432 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54"}
Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.996323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerStarted","Data":"c3e02332eed0f6ac50479a637c2f9551186161a99dab978e61007f6da0cf9aba"}
Mar 21 04:42:23 crc kubenswrapper[4839]: W0321 04:42:23.025325 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f7903c_4081_446b_94d9_ad979332590b.slice/crio-104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2 WatchSource:0}: Error finding container 104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2: Status 404 returned error can't find the container with id 104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2
Mar 21 04:42:23 crc kubenswrapper[4839]: W0321 04:42:23.027475 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d74e911_e100_4e79_89be_202e06bb4d30.slice/crio-7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f WatchSource:0}: Error finding container 7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f: Status 404 returned error can't find the container with id 7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.030621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hrww8"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.042791 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.117615 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.130017 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.279676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.283491 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289445 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289704 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.328426 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450288 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") "
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") "
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450524 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") "
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450812 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450856 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450886 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451046 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451066 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config" (OuterVolumeSpecName: "config") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451334 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.455347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj" (OuterVolumeSpecName: "kube-api-access-grztj") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "kube-api-access-grztj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952407 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952480 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952598 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952615 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952631 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.955665 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.956788 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.958385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.963101 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.963942 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-f7k5z"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.964583 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965230 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965718 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965901 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.967687 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.968644 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.969147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.981720 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.004507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerStarted","Data":"104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2"}
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.008634 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4" event={"ID":"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a","Type":"ContainerStarted","Data":"3f05c669d23667bcf282f85eaf3b128b498ba04ca003dea9bc418e7987ec280b"}
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.009970 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" event={"ID":"2f135620-f512-4bfe-9875-4d4e07c6a0f5","Type":"ContainerDied","Data":"2be9891bd895defb2ca0d28836ebf421c8763119934d2eefdf49a40b80778096"}
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.010058 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.012250 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f"}
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.109809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.111026 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118205 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-65bwr"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118486 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118551 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118775 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.122792 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.157910 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158836 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158975 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159036 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.212684 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mx5tf"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.265692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.265756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270885 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270916 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270938 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271040 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271064 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271131 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271169 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.272033 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.274351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.274355 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.275847 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.276865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.285456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.287693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.294069 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.294341 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425047 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425978 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426086 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426204 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426361 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426511 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426634 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426645 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426287 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.429887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.430096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.430533 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID:
\"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.431762 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.433585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.445603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.452661 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.462456 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef534e12-a75b-4e9b-b2a3-4046bece5903" path="/var/lib/kubelet/pods/ef534e12-a75b-4e9b-b2a3-4046bece5903/volumes" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.524501 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.532865 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:26 crc kubenswrapper[4839]: I0321 04:42:26.728322 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 04:42:26 crc kubenswrapper[4839]: I0321 04:42:26.763210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"] Mar 21 04:42:27 crc kubenswrapper[4839]: I0321 04:42:27.637415 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 04:42:29 crc kubenswrapper[4839]: W0321 04:42:29.105210 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2e5ef4_e4c0_4278_897e_ce5d00b4079d.slice/crio-d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713 WatchSource:0}: Error finding container d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713: Status 404 returned error can't find the container with id d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713 Mar 21 04:42:29 crc kubenswrapper[4839]: W0321 04:42:29.106433 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7a1028_3deb_4033_890c_db0861c6a9a2.slice/crio-a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921 WatchSource:0}: Error finding container a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921: Status 404 returned error can't find the container with id a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921 Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.057178 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.058961 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.060287 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mx5tf" event={"ID":"64d13111-845e-4c61-a4ce-483ddfb799b7","Type":"ContainerStarted","Data":"af359b866288950544e18ecda8a727bc0e66ac01a39c017d435cd3186e9fc456"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980616 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980673 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980719 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.981387 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.981439 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c" gracePeriod=600 Mar 21 04:42:31 crc kubenswrapper[4839]: I0321 04:42:31.675885 4839 scope.go:117] "RemoveContainer" containerID="edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484" Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081815 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c" exitCode=0 Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081854 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081921 4839 scope.go:117] "RemoveContainer" containerID="d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" Mar 21 04:42:42 crc kubenswrapper[4839]: I0321 04:42:42.158974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360"} Mar 21 04:42:42 crc kubenswrapper[4839]: I0321 04:42:42.162348 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"} Mar 21 04:42:43 crc kubenswrapper[4839]: I0321 04:42:43.172553 4839 generic.go:334] "Generic (PLEG): container finished" podID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerID="f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360" exitCode=0 Mar 21 04:42:43 crc kubenswrapper[4839]: I0321 04:42:43.172652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360"} Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704107 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704503 4839 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704671 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fr9w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(76b8f1b8-aa66-4f5e-937a-f837a2da28f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.705868 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" Mar 21 04:42:44 crc kubenswrapper[4839]: E0321 04:42:44.184941 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.195628 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.196235 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.197786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4" event={"ID":"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a","Type":"ContainerStarted","Data":"03174b199080cd580b420f87aeff54e80e4f62879ff8261601e9b256242195a2"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.197926 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.199807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c","Type":"ContainerStarted","Data":"0b8b7584ab62d33c68dbb641dd3ecf4023cf10a177e28f93543a064496569031"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.199922 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.201859 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"f47aaf50b193096eb277420228011f3f9ef749e81e13ef2453a74549a72a6b4a"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.203592 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.207151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.209174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"435e1ac136be0d7f0e75d3d6e51aaa96829af337f68d82ee84841b76ec92423a"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.210820 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.212620 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="3d74e911-e100-4e79-89be-202e06bb4d30" containerID="dbd877256a7c5b5bd86e79c64adcea4e8fc78bd1bc828444b55643dd18222c51" exitCode=0 Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.212701 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerDied","Data":"dbd877256a7c5b5bd86e79c64adcea4e8fc78bd1bc828444b55643dd18222c51"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.217760 4839 generic.go:334] "Generic (PLEG): container finished" podID="45f7903c-4081-446b-94d9-ad979332590b" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b" exitCode=0 Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.217823 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.238734 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" podStartSLOduration=31.900222346 podStartE2EDuration="35.238713681s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.678487073 +0000 UTC m=+1147.006273749" lastFinishedPulling="2026-03-21 04:42:26.016978408 +0000 UTC m=+1150.344765084" observedRunningTime="2026-03-21 04:42:45.216897041 +0000 UTC m=+1169.544683727" watchObservedRunningTime="2026-03-21 04:42:45.238713681 +0000 UTC m=+1169.566500357" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.242150 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.515870427 podStartE2EDuration="31.242133567s" podCreationTimestamp="2026-03-21 04:42:14 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.700201561 +0000 UTC 
m=+1147.027988237" lastFinishedPulling="2026-03-21 04:42:35.426464691 +0000 UTC m=+1159.754251377" observedRunningTime="2026-03-21 04:42:45.232901299 +0000 UTC m=+1169.560687995" watchObservedRunningTime="2026-03-21 04:42:45.242133567 +0000 UTC m=+1169.569920243" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.282446 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qt5s4" podStartSLOduration=9.766836292 podStartE2EDuration="25.282426494s" podCreationTimestamp="2026-03-21 04:42:20 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.987765975 +0000 UTC m=+1147.315552651" lastFinishedPulling="2026-03-21 04:42:38.503356177 +0000 UTC m=+1162.831142853" observedRunningTime="2026-03-21 04:42:45.280216182 +0000 UTC m=+1169.608002878" watchObservedRunningTime="2026-03-21 04:42:45.282426494 +0000 UTC m=+1169.610213180" Mar 21 04:42:46 crc kubenswrapper[4839]: I0321 04:42:46.258662 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.284725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mx5tf" event={"ID":"64d13111-845e-4c61-a4ce-483ddfb799b7","Type":"ContainerStarted","Data":"175bbeed186af8e0c1144e458ee1210b3823c52893d44d05376e0576ec7042d3"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.288938 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"dc90889004e71da508eff82ba8b3e962b8dde614cbf9d560b36062af63991346"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.292026 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" 
event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerStarted","Data":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.292241 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.314414 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" podStartSLOduration=24.808359938 podStartE2EDuration="37.314393327s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:23.029159383 +0000 UTC m=+1147.356946059" lastFinishedPulling="2026-03-21 04:42:35.535192772 +0000 UTC m=+1159.862979448" observedRunningTime="2026-03-21 04:42:47.311040424 +0000 UTC m=+1171.638827130" watchObservedRunningTime="2026-03-21 04:42:47.314393327 +0000 UTC m=+1171.642180013" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.305560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"c545a523b095359d2d8c61304f37592d2cebcf5b76db7972a0012ae265bb8e7e"} Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.306312 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.308060 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"8cef22d68ec78af1eb35427e9159a065e50b724bcca1f49f2f330f9757887e18"} Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.345212 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hrww8" podStartSLOduration=15.280689572 podStartE2EDuration="28.345186044s" 
podCreationTimestamp="2026-03-21 04:42:20 +0000 UTC" firstStartedPulling="2026-03-21 04:42:23.029895014 +0000 UTC m=+1147.357681690" lastFinishedPulling="2026-03-21 04:42:36.094391486 +0000 UTC m=+1160.422178162" observedRunningTime="2026-03-21 04:42:48.328032744 +0000 UTC m=+1172.655819420" watchObservedRunningTime="2026-03-21 04:42:48.345186044 +0000 UTC m=+1172.672972730" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.354600 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.285219283 podStartE2EDuration="25.354548216s" podCreationTimestamp="2026-03-21 04:42:23 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.630032054 +0000 UTC m=+1153.957818770" lastFinishedPulling="2026-03-21 04:42:47.699361017 +0000 UTC m=+1172.027147703" observedRunningTime="2026-03-21 04:42:48.350829852 +0000 UTC m=+1172.678616538" watchObservedRunningTime="2026-03-21 04:42:48.354548216 +0000 UTC m=+1172.682334912" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.533367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.582268 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.604675 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mx5tf" podStartSLOduration=9.028040949 podStartE2EDuration="25.604642972s" podCreationTimestamp="2026-03-21 04:42:23 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.629741626 +0000 UTC m=+1153.957528312" lastFinishedPulling="2026-03-21 04:42:46.206343659 +0000 UTC m=+1170.534130335" observedRunningTime="2026-03-21 04:42:48.37866829 +0000 UTC m=+1172.706454956" watchObservedRunningTime="2026-03-21 04:42:48.604642972 +0000 UTC m=+1172.932429648" Mar 21 04:42:48 crc 
kubenswrapper[4839]: I0321 04:42:48.785868 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.786553 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns" containerID="cri-o://3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782" gracePeriod=10 Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.826170 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.827426 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.830241 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.845701 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944709 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 
04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944991 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.000911 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047199 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047387 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048344 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048636 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.082711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.084834 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"]
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.086278 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.093056 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.094917 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"]
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148512 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148540 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148597 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148638 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.168881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258170 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258248 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258273 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.276394 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.284488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.290527 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.292111 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.295714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.323092 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"dfd3e2ddc3b5f48fb0549e59e6def9e56a5ded8e1205e64ee9554fc4e94a80b9"}
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326205 4839 generic.go:334] "Generic (PLEG): container finished" podID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerID="3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782" exitCode=0
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782"}
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326942 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326973 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hrww8"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.327076 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns" containerID="cri-o://ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" gracePeriod=10
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.395058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.421753 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.726753314 podStartE2EDuration="27.421730099s" podCreationTimestamp="2026-03-21 04:42:22 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.629630573 +0000 UTC m=+1153.957417249" lastFinishedPulling="2026-03-21 04:42:48.324607338 +0000 UTC m=+1172.652394034" observedRunningTime="2026-03-21 04:42:49.358300105 +0000 UTC m=+1173.686086801" watchObservedRunningTime="2026-03-21 04:42:49.421730099 +0000 UTC m=+1173.749516775"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.482740 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.516731 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.525147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667272 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667704 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667771 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.673488 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx" (OuterVolumeSpecName: "kube-api-access-6cxnx") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "kube-api-access-6cxnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.712645 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.738084 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config" (OuterVolumeSpecName: "config") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.752864 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"]
Mar 21 04:42:49 crc kubenswrapper[4839]: W0321 04:42:49.756410 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40fc2c3c_2c23_497a_89d6_906ba78506c2.slice/crio-19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252 WatchSource:0}: Error finding container 19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252: Status 404 returned error can't find the container with id 19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.767915 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.769955 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.769989 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.770007 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.871941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.872442 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.872528 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") "
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.876699 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq" (OuterVolumeSpecName: "kube-api-access-djkvq") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "kube-api-access-djkvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.911479 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.919679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config" (OuterVolumeSpecName: "config") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974657 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974711 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974723 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.012740 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"]
Mar 21 04:42:50 crc kubenswrapper[4839]: W0321 04:42:50.015905 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd92c95bc_4fde_4d24_ad6e_d4583ec19b3a.slice/crio-8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d WatchSource:0}: Error finding container 8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d: Status 404 returned error can't find the container with id 8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.075316 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.339211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerStarted","Data":"8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341039 4839 generic.go:334] "Generic (PLEG): container finished" podID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerID="8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300" exitCode=0
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341125 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerStarted","Data":"19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344667 4839 generic.go:334] "Generic (PLEG): container finished" podID="45f7903c-4081-446b-94d9-ad979332590b" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" exitCode=0
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344740 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344756 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344778 4839 scope.go:117] "RemoveContainer" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.348554 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"0ce222ce6f31634231e7d8b6628241b56c0a1c3b1c7afeb55d6d347fe6bd4c49"}
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.349237 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.451961 4839 scope.go:117] "RemoveContainer" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.493528 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.500226 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.505619 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.509690 4839 scope.go:117] "RemoveContainer" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"
Mar 21 04:42:50 crc kubenswrapper[4839]: E0321 04:42:50.510168 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": container with ID starting with ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5 not found: ID does not exist" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510204 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"} err="failed to get container status \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": rpc error: code = NotFound desc = could not find container \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": container with ID starting with ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5 not found: ID does not exist"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510227 4839 scope.go:117] "RemoveContainer" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"
Mar 21 04:42:50 crc kubenswrapper[4839]: E0321 04:42:50.510531 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": container with ID starting with ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b not found: ID does not exist" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510587 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"} err="failed to get container status \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": rpc error: code = NotFound desc = could not find container \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": container with ID starting with ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b not found: ID does not exist"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510614 4839 scope.go:117] "RemoveContainer" containerID="3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782"
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.513002 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.540789 4839 scope.go:117] "RemoveContainer" containerID="f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360"
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.359300 4839 generic.go:334] "Generic (PLEG): container finished" podID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerID="fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5" exitCode=0
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.359397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5"}
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.362434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerStarted","Data":"5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910"}
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.362901 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj"
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.409522 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" podStartSLOduration=3.409503788 podStartE2EDuration="3.409503788s" podCreationTimestamp="2026-03-21 04:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:42:51.401994478 +0000 UTC m=+1175.729781194" watchObservedRunningTime="2026-03-21 04:42:51.409503788 +0000 UTC m=+1175.737290474"
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.525668 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.592737 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.371429 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerStarted","Data":"97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b"}
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.372016 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-4z7nl"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.393558 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podStartSLOduration=3.393537077 podStartE2EDuration="3.393537077s" podCreationTimestamp="2026-03-21 04:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:42:52.387823697 +0000 UTC m=+1176.715610393" watchObservedRunningTime="2026-03-21 04:42:52.393537077 +0000 UTC m=+1176.721323753"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.412583 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.469279 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f7903c-4081-446b-94d9-ad979332590b" path="/var/lib/kubelet/pods/45f7903c-4081-446b-94d9-ad979332590b/volumes"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.469882 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" path="/var/lib/kubelet/pods/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5/volumes"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550274 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550649 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550671 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550713 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="init"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550720 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="init"
Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550733 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550740 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550757 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="init"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550763 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="init"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550918 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550934 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.551745 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.553929 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.554406 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c8b6d"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.557234 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.562370 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.582407 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623454 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623510 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623605 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725101 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725500 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0"
Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725711 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") 
" pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726018 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726729 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.730895 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.734082 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.737535 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.751537 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.869625 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 04:42:53 crc kubenswrapper[4839]: I0321 04:42:53.124641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 04:42:53 crc kubenswrapper[4839]: I0321 04:42:53.379716 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"cb26c02e4b67b2ee72c3a057d0d13e316aa7baf49fd4350ce7ca2df46971a652"} Mar 21 04:42:54 crc kubenswrapper[4839]: I0321 04:42:54.510718 4839 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2f135620-f512-4bfe-9875-4d4e07c6a0f5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f135620_f512_4bfe_9875_4d4e07c6a0f5.slice" Mar 21 04:42:54 crc kubenswrapper[4839]: E0321 04:42:54.510805 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f135620_f512_4bfe_9875_4d4e07c6a0f5.slice" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397184 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397213 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"c22c6e1a7be56bf8725b63f090a6cdc87ec4c4d7a1e8cf8cc014466210946c61"} Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397678 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"3b217217438dd5c4ebb0a93f1eac576c17e0299263c356ee52a23ab7cb389bc3"} Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.398008 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.425531 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8447265449999999 podStartE2EDuration="3.425512838s" podCreationTimestamp="2026-03-21 04:42:52 +0000 UTC" firstStartedPulling="2026-03-21 04:42:53.133021994 +0000 UTC m=+1177.460808670" lastFinishedPulling="2026-03-21 04:42:54.713808287 +0000 UTC m=+1179.041594963" observedRunningTime="2026-03-21 04:42:55.423867732 +0000 UTC m=+1179.751654408" watchObservedRunningTime="2026-03-21 04:42:55.425512838 +0000 UTC m=+1179.753299524" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.485978 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.494169 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:56 crc kubenswrapper[4839]: I0321 04:42:56.460949 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" 
path="/var/lib/kubelet/pods/2f135620-f512-4bfe-9875-4d4e07c6a0f5/volumes" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.287856 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.289303 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" containerID="cri-o://5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" gracePeriod=10 Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.294634 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.319025 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.320279 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335555 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335666 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335704 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335879 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: 
\"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.357758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437187 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437220 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437260 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.438287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.438852 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.439373 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.439887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.458837 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" 
(UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.638991 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.106107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.125527 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f1edf0d-f220-4815-aeb6-e4507576247a" containerID="eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.125937 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerDied","Data":"eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.133438 4839 generic.go:334] "Generic (PLEG): container finished" podID="30d22e92-45bd-4d1e-954e-3ade801245d4" containerID="03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.133536 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerDied","Data":"03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.138495 4839 generic.go:334] "Generic (PLEG): container finished" podID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerID="5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.138539 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" 
event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.176532 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.387549 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.392707 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.395812 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jcqm2" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.396053 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.407853 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.407870 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.413610 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.483757 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.554897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555212 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555309 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555365 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657466 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657527 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657555 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657623 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657886 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657920 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657982 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:00.157962832 +0000 UTC m=+1184.485749508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.658101 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.658471 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.659587 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.664053 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.674436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.683385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:00 crc kubenswrapper[4839]: I0321 04:43:00.149790 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"57de16c4224a656e8f3fcae76650a94702fb081fd5f9e8c3856fcde976889201"} Mar 21 04:43:00 crc kubenswrapper[4839]: I0321 04:43:00.165743 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 
21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166049 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166092 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166302 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:01.166279461 +0000 UTC m=+1185.494066137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: I0321 04:43:01.200098 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200315 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200396 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200449 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 
nodeName:}" failed. No retries permitted until 2026-03-21 04:43:03.200434762 +0000 UTC m=+1187.528221438 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.166194 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"d0d0cc3b9992c79ba662d880c6be2ed4e505bdd6e757aa9952909ab5d979d31f"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.168346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.170512 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"067f116e13aa69b91b4d2ca31991a45e297831881328d0b0917f54a9ef074313"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.401388 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.523994 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524588 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524651 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524683 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.530025 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx" (OuterVolumeSpecName: "kube-api-access-9dqqx") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "kube-api-access-9dqqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.565250 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.569513 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config" (OuterVolumeSpecName: "config") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.574376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.627998 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628073 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628094 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628108 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.178205 4839 generic.go:334] "Generic (PLEG): container finished" podID="67dd1633-1450-4153-b0af-b6887f61944c" containerID="285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb" exitCode=0 Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.178268 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180437 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180469 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180505 4839 scope.go:117] "RemoveContainer" containerID="5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.183121 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerStarted","Data":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.183638 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.207717 4839 scope.go:117] "RemoveContainer" containerID="8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.238312 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=38.045040249 podStartE2EDuration="51.238292313s" podCreationTimestamp="2026-03-21 04:42:12 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.702505945 +0000 UTC m=+1147.030292621" lastFinishedPulling="2026-03-21 04:42:35.895758009 +0000 UTC m=+1160.223544685" observedRunningTime="2026-03-21 04:43:03.223979852 +0000 UTC m=+1187.551766538" watchObservedRunningTime="2026-03-21 04:43:03.238292313 +0000 UTC m=+1187.566078989" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.241538 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241868 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241888 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241932 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:07.241916224 +0000 UTC m=+1191.569702900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.252626 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.398790814 podStartE2EDuration="46.252605513s" podCreationTimestamp="2026-03-21 04:42:17 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.678151984 +0000 UTC m=+1147.005938660" lastFinishedPulling="2026-03-21 04:43:02.531966683 +0000 UTC m=+1186.859753359" observedRunningTime="2026-03-21 04:43:03.240946577 +0000 UTC m=+1187.568733253" watchObservedRunningTime="2026-03-21 04:43:03.252605513 +0000 UTC m=+1187.580392189" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.274014 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.118178033 podStartE2EDuration="50.273998121s" podCreationTimestamp="2026-03-21 04:42:13 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.899368502 +0000 UTC m=+1147.227155178" lastFinishedPulling="2026-03-21 04:42:38.05518859 +0000 UTC m=+1162.382975266" observedRunningTime="2026-03-21 04:43:03.271705927 +0000 UTC m=+1187.599492603" watchObservedRunningTime="2026-03-21 04:43:03.273998121 +0000 UTC m=+1187.601784797" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.292908 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.298865 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.375612 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.376147 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376181 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.376220 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="init" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376227 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="init" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376411 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.379931 
4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385198 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385242 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385786 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.411415 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.412354 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-698g5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-698g5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-grqqv" podUID="40c8620e-6bc9-444b-8ab8-9428633490f3" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.426061 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.427326 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.446500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.466792 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.467120 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.469408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547416 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547520 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547544 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547577 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547601 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547623 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547711 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547743 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547781 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547872 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 
04:43:03.547911 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649548 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649918 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650024 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650174 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698g5\" 
(UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650484 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650581 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod 
\"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651119 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651241 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651848 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.652080 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc 
kubenswrapper[4839]: I0321 04:43:03.654972 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.655713 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.656329 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.658004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.659073 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.663146 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.669278 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.671604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.760121 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.200003 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029"} Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.200066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.226547 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podStartSLOduration=6.226529698 podStartE2EDuration="6.226529698s" podCreationTimestamp="2026-03-21 04:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:04.224171272 +0000 UTC m=+1188.551957968" watchObservedRunningTime="2026-03-21 04:43:04.226529698 +0000 UTC m=+1188.554316374" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.229589 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:04 crc kubenswrapper[4839]: W0321 04:43:04.264125 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5484abbf_53f2_445a_b6fe_0996eba95345.slice/crio-609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5 WatchSource:0}: Error finding container 609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5: Status 404 returned error can't find the container with id 609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5 Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.268927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.363738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364014 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364043 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364177 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364230 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364260 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364307 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 
04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.365678 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.367128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts" (OuterVolumeSpecName: "scripts") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.367524 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.371660 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.371716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5" (OuterVolumeSpecName: "kube-api-access-698g5") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "kube-api-access-698g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.375853 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.376005 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.462065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" path="/var/lib/kubelet/pods/40fc2c3c-2c23-497a-89d6-906ba78506c2/volumes" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466896 4839 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466936 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466948 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466964 4839 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466976 4839 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466989 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.467000 4839 reconciler_common.go:293] 
"Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.740930 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.740984 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerStarted","Data":"609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5"} Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215059 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215428 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.251811 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.260623 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:06 crc kubenswrapper[4839]: I0321 04:43:06.464557 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c8620e-6bc9-444b-8ab8-9428633490f3" path="/var/lib/kubelet/pods/40c8620e-6bc9-444b-8ab8-9428633490f3/volumes" Mar 21 04:43:07 crc kubenswrapper[4839]: I0321 04:43:07.312010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312266 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312310 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312377 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:15.312355975 +0000 UTC m=+1199.640142651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: I0321 04:43:07.478939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.640828 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.696820 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.697071 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" 
containerID="cri-o://97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" gracePeriod=10 Mar 21 04:43:09 crc kubenswrapper[4839]: I0321 04:43:09.483498 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.264483 4839 generic.go:334] "Generic (PLEG): container finished" podID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerID="97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" exitCode=0 Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.264558 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b"} Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.932489 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 21 04:43:13 crc kubenswrapper[4839]: I0321 04:43:13.579386 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 21 04:43:13 crc kubenswrapper[4839]: I0321 04:43:13.657196 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.217902 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.283965 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.284383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d"} Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.285218 4839 scope.go:117] "RemoveContainer" containerID="97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.331721 4839 scope.go:117] "RemoveContainer" containerID="fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343531 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343560 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343774 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343821 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.347486 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2" (OuterVolumeSpecName: "kube-api-access-j56r2") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "kube-api-access-j56r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.387049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.387784 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.393741 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.394824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config" (OuterVolumeSpecName: "config") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445864 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445908 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445917 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445929 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc 
kubenswrapper[4839]: I0321 04:43:14.445937 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.605695 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.631907 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.878066 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.959003 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.290305 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerStarted","Data":"93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56"} Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.309911 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kkvzq" podStartSLOduration=2.354275297 podStartE2EDuration="12.309885448s" podCreationTimestamp="2026-03-21 04:43:03 +0000 UTC" firstStartedPulling="2026-03-21 04:43:04.267727821 +0000 UTC m=+1188.595514497" lastFinishedPulling="2026-03-21 04:43:14.223337972 +0000 UTC m=+1198.551124648" observedRunningTime="2026-03-21 04:43:15.307552892 +0000 UTC m=+1199.635339578" watchObservedRunningTime="2026-03-21 04:43:15.309885448 +0000 UTC m=+1199.637672124" Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.359600 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.359800 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.359816 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.360269 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:31.360214026 +0000 UTC m=+1215.688000702 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.791982 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qt5s4" podUID="b31b64cb-0266-4b8a-9fcb-ae5e36c8309a" containerName="ovn-controller" probeResult="failure" output=< Mar 21 04:43:15 crc kubenswrapper[4839]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 04:43:15 crc kubenswrapper[4839]: > Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110311 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: E0321 04:43:16.110646 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110679 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: E0321 04:43:16.110698 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="init" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110705 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="init" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110870 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.111347 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.113635 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.119996 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.162914 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.163859 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.173609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.173717 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.176547 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.274788 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275009 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: 
I0321 04:43:16.275056 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.293548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.322480 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.324499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.333092 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.385975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386049 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386166 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386199 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.412674 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.430911 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.437524 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.438667 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.441512 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.472983 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" path="/var/lib/kubelet/pods/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a/volumes" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.474893 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.478375 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.487965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488061 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488127 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.489075 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod 
\"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.513404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.591656 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.592022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.593391 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.619431 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcpm\" (UniqueName: 
\"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.690950 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.872376 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.950708 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: W0321 04:43:16.957785 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad3cc08_174a_4164_aa38_3d7f6fbed0c0.slice/crio-fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3 WatchSource:0}: Error finding container fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3: Status 404 returned error can't find the container with id fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3 Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.965712 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.028487 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.134597 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:17 crc kubenswrapper[4839]: W0321 04:43:17.157210 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9a6840a_2ece_4b8d_be60_caa89912db9f.slice/crio-978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb WatchSource:0}: Error finding container 978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb: Status 404 returned error can't find the container with id 978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.308995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerStarted","Data":"048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.309348 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerStarted","Data":"97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.319857 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerStarted","Data":"978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.321580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerStarted","Data":"435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.321612 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" 
event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerStarted","Data":"fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.327223 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-v4k9c" podStartSLOduration=1.327207793 podStartE2EDuration="1.327207793s" podCreationTimestamp="2026-03-21 04:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:17.325885256 +0000 UTC m=+1201.653671932" watchObservedRunningTime="2026-03-21 04:43:17.327207793 +0000 UTC m=+1201.654994469" Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.345226 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.346246 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-852d-account-create-update-nv5n7" podStartSLOduration=1.346229485 podStartE2EDuration="1.346229485s" podCreationTimestamp="2026-03-21 04:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:17.341272247 +0000 UTC m=+1201.669058933" watchObservedRunningTime="2026-03-21 04:43:17.346229485 +0000 UTC m=+1201.674016161" Mar 21 04:43:17 crc kubenswrapper[4839]: W0321 04:43:17.349933 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ce13a7_d2a6_4c54_908d_39d1511da50b.slice/crio-b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3 WatchSource:0}: Error finding container b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3: Status 404 returned error can't find the container with id 
b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.330627 4839 generic.go:334] "Generic (PLEG): container finished" podID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.330753 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.337637 4839 generic.go:334] "Generic (PLEG): container finished" podID="8028561c-b039-4400-a065-b5efee753b5f" containerID="fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.337717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.340457 4839 generic.go:334] "Generic (PLEG): container finished" podID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerID="435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.340622 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerDied","Data":"435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344533 4839 generic.go:334] "Generic (PLEG): container finished" podID="59ce13a7-d2a6-4c54-908d-39d1511da50b" 
containerID="e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerDied","Data":"e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344649 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerStarted","Data":"b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.352459 4839 generic.go:334] "Generic (PLEG): container finished" podID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerID="048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.352524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerDied","Data":"048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.356520 4839 generic.go:334] "Generic (PLEG): container finished" podID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerID="e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.356598 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerDied","Data":"e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.388116 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.389097 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.394855 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.395125 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.442834 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.608994074 podStartE2EDuration="1m9.442651093s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.722289259 +0000 UTC m=+1147.050075935" lastFinishedPulling="2026-03-21 04:42:36.555946278 +0000 UTC m=+1160.883732954" observedRunningTime="2026-03-21 04:43:19.416696987 +0000 UTC m=+1203.744483683" watchObservedRunningTime="2026-03-21 04:43:19.442651093 +0000 UTC m=+1203.770437789" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.470977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.658188445 podStartE2EDuration="1m9.470955065s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.742386071 +0000 UTC m=+1147.070172747" lastFinishedPulling="2026-03-21 04:42:35.555152691 +0000 UTC m=+1159.882939367" observedRunningTime="2026-03-21 04:43:19.459308949 +0000 UTC m=+1203.787095655" 
watchObservedRunningTime="2026-03-21 04:43:19.470955065 +0000 UTC m=+1203.798741741" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.844840 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.855546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"59ce13a7-d2a6-4c54-908d-39d1511da50b\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.855701 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"59ce13a7-d2a6-4c54-908d-39d1511da50b\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.856923 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59ce13a7-d2a6-4c54-908d-39d1511da50b" (UID: "59ce13a7-d2a6-4c54-908d-39d1511da50b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.872157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm" (OuterVolumeSpecName: "kube-api-access-frcpm") pod "59ce13a7-d2a6-4c54-908d-39d1511da50b" (UID: "59ce13a7-d2a6-4c54-908d-39d1511da50b"). InnerVolumeSpecName "kube-api-access-frcpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.959687 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.959727 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.032419 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.041193 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.049830 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060562 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060612 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"c9a6840a-2ece-4b8d-be60-caa89912db9f\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060666 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060779 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"c9a6840a-2ece-4b8d-be60-caa89912db9f\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060805 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061104 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e779c2ff-ee70-4779-b3fc-3b3bf87aff47" (UID: "e779c2ff-ee70-4779-b3fc-3b3bf87aff47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061357 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" (UID: "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061599 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9a6840a-2ece-4b8d-be60-caa89912db9f" (UID: "c9a6840a-2ece-4b8d-be60-caa89912db9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.064449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x" (OuterVolumeSpecName: "kube-api-access-fm69x") pod "c9a6840a-2ece-4b8d-be60-caa89912db9f" (UID: "c9a6840a-2ece-4b8d-be60-caa89912db9f"). 
InnerVolumeSpecName "kube-api-access-fm69x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.065845 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl" (OuterVolumeSpecName: "kube-api-access-49pjl") pod "e779c2ff-ee70-4779-b3fc-3b3bf87aff47" (UID: "e779c2ff-ee70-4779-b3fc-3b3bf87aff47"). InnerVolumeSpecName "kube-api-access-49pjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.068220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44" (OuterVolumeSpecName: "kube-api-access-wnt44") pod "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" (UID: "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0"). InnerVolumeSpecName "kube-api-access-wnt44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163275 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163546 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163663 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163745 4839 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163836 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163951 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.217817 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218461 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218528 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218617 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218687 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218753 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218805 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218863 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218918 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219139 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219503 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219594 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219677 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.220320 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.228426 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.265988 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.266039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.332084 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.333119 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.335281 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.344043 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.367578 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.367940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.368140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.368250 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: 
\"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.369034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.400048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403623 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerDied","Data":"fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3"} Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403675 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403763 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406431 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406441 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerDied","Data":"b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3"}
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406480 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerDied","Data":"97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302"}
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408131 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408252 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v4k9c"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.413130 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tnx95"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.414659 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerDied","Data":"978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb"}
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.414729 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.470399 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.470522 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.471131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.488728 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.548946 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqfbm"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.649302 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.852901 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hrww8"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.858536 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qt5s4" podUID="b31b64cb-0266-4b8a-9fcb-ae5e36c8309a" containerName="ovn-controller" probeResult="failure" output=<
Mar 21 04:43:20 crc kubenswrapper[4839]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 21 04:43:20 crc kubenswrapper[4839]: >
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.948104 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hrww8"
Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.985246 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqfbm"]
Mar 21 04:43:21 crc kubenswrapper[4839]: W0321 04:43:21.004860 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb46c59d5_1b87_471e_ae9b_b8ba7ca8d754.slice/crio-91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8 WatchSource:0}: Error finding container 91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8: Status 404 returned error can't find the container with id 91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.143877 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"]
Mar 21 04:43:21 crc kubenswrapper[4839]: W0321 04:43:21.148713 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5740bec9_4b0c_4092_8309_14fdb2562c2e.slice/crio-90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70 WatchSource:0}: Error finding container 90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70: Status 404 returned error can't find the container with id 90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.166953 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"]
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.174092 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.179063 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"]
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.180778 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183852 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183992 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.184010 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.184033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285883 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285923 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285999 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286333 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286408 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.288245 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.307644 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.419508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerStarted","Data":"18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"}
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.419562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerStarted","Data":"91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8"}
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.422054 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerStarted","Data":"bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"}
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.422187 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerStarted","Data":"90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70"}
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.441709 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-vqfbm" podStartSLOduration=1.441687746 podStartE2EDuration="1.441687746s" podCreationTimestamp="2026-03-21 04:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:21.437125939 +0000 UTC m=+1205.764912625" watchObservedRunningTime="2026-03-21 04:43:21.441687746 +0000 UTC m=+1205.769474422"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.456423 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-31f4-account-create-update-98c9m" podStartSLOduration=1.456406278 podStartE2EDuration="1.456406278s" podCreationTimestamp="2026-03-21 04:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:21.455601115 +0000 UTC m=+1205.783387791" watchObservedRunningTime="2026-03-21 04:43:21.456406278 +0000 UTC m=+1205.784192954"
Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.556366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.035720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"]
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.170704 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fgh6w"]
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.171703 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.173640 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.188801 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgh6w"]
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.302237 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.302385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.403591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.403667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.404525 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.423689 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.444117 4839 generic.go:334] "Generic (PLEG): container finished" podID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerID="18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef" exitCode=0
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.444185 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerDied","Data":"18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"}
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.446770 4839 generic.go:334] "Generic (PLEG): container finished" podID="5484abbf-53f2-445a-b6fe-0996eba95345" containerID="93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56" exitCode=0
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.446825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerDied","Data":"93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56"}
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.448727 4839 generic.go:334] "Generic (PLEG): container finished" podID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerID="bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775" exitCode=0
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.448874 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerDied","Data":"bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"}
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.472103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerStarted","Data":"d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7"}
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.472161 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerStarted","Data":"c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb"}
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.495332 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.514601 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qt5s4-config-llx9j" podStartSLOduration=1.5145837100000001 podStartE2EDuration="1.51458371s" podCreationTimestamp="2026-03-21 04:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:22.507611485 +0000 UTC m=+1206.835398181" watchObservedRunningTime="2026-03-21 04:43:22.51458371 +0000 UTC m=+1206.842370386"
Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.982837 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgh6w"]
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.476303 4839 generic.go:334] "Generic (PLEG): container finished" podID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerID="6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16" exitCode=0
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.477052 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerDied","Data":"6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16"}
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.477240 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerStarted","Data":"2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880"}
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.484695 4839 generic.go:334] "Generic (PLEG): container finished" podID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerID="d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7" exitCode=0
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.484733 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerDied","Data":"d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7"}
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.916985 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq"
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.964538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.964894 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965084 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.966085 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.966219 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") "
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.968900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.969518 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.970661 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p" (OuterVolumeSpecName: "kube-api-access-4552p") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "kube-api-access-4552p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.978803 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.999340 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts" (OuterVolumeSpecName: "scripts") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.008937 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.015440 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069086 4839 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069139 4839 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069161 4839 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069177 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069191 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069204 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069216 4839 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.105083 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.117596 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqfbm"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.169881 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") "
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170015 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"5740bec9-4b0c-4092-8309-14fdb2562c2e\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") "
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170156 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"5740bec9-4b0c-4092-8309-14fdb2562c2e\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") "
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") "
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.171210 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5740bec9-4b0c-4092-8309-14fdb2562c2e" (UID: "5740bec9-4b0c-4092-8309-14fdb2562c2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.171376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" (UID: "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.173869 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph" (OuterVolumeSpecName: "kube-api-access-q9qph") pod "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" (UID: "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754"). InnerVolumeSpecName "kube-api-access-q9qph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.177074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5" (OuterVolumeSpecName: "kube-api-access-qpmp5") pod "5740bec9-4b0c-4092-8309-14fdb2562c2e" (UID: "5740bec9-4b0c-4092-8309-14fdb2562c2e"). InnerVolumeSpecName "kube-api-access-qpmp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272297 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272604 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272622 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272635 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") on node \"crc\" DevicePath \"\""
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.495260 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.495241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerDied","Data":"609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5"}
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.497967 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerDied","Data":"90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70"}
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504765 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504793 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.506894 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerDied","Data":"91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8"}
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.506926 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8"
Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.507211 4839 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.978235 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.984662 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125133 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125228 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125264 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125291 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125324 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125398 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125426 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: 
\"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125552 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125743 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fc450b4-4ccf-4e6e-97d1-d47f252be788" (UID: "0fc450b4-4ccf-4e6e-97d1-d47f252be788"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125882 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125951 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run" (OuterVolumeSpecName: "var-run") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126313 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126329 4839 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126338 4839 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126346 4839 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126393 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts" (OuterVolumeSpecName: "scripts") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.130547 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx" (OuterVolumeSpecName: "kube-api-access-hmcrx") pod "0fc450b4-4ccf-4e6e-97d1-d47f252be788" (UID: "0fc450b4-4ccf-4e6e-97d1-d47f252be788"). InnerVolumeSpecName "kube-api-access-hmcrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.147932 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm" (OuterVolumeSpecName: "kube-api-access-bccfm") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "kube-api-access-bccfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228129 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228165 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228174 4839 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228185 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmcrx\" (UniqueName: 
\"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerDied","Data":"2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880"} Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515604 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515617 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517326 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerDied","Data":"c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb"} Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517372 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517383 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577387 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577794 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577815 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577832 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577841 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577849 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577857 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577872 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577880 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577896 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577905 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578100 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578115 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578129 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578141 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578151 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578810 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.580840 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.581331 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.595379 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634839 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.641906 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.648392 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736871 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736899 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: 
\"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.741763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.742002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.743056 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.751892 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.754190 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.763200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.763526 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.767832 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.826933 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qt5s4" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838832 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838954 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839050 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.906066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940669 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940764 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940899 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " 
pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940929 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941974 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.943658 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 
crc kubenswrapper[4839]: I0321 04:43:25.944370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.963118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.085574 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.462661 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" path="/var/lib/kubelet/pods/3fb143a8-0cf7-4e83-8e23-aa49453bac07/volumes" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.491229 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:26 crc kubenswrapper[4839]: W0321 04:43:26.496992 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc1dfb9_8108_46e5_8dc5_5b555590ecc1.slice/crio-65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e WatchSource:0}: Error finding container 65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e: Status 404 returned error can't find the container with id 65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.525869 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerStarted","Data":"65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e"} Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.554677 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:26 crc kubenswrapper[4839]: W0321 04:43:26.556343 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a1c7cc4_a44f_4c22_ac1c_9ef543768cf7.slice/crio-c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d WatchSource:0}: Error finding container c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d: Status 404 returned error can't find the container with id c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537260 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerID="4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762" exitCode=0 Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537541 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerDied","Data":"4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762"} Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerStarted","Data":"c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d"} Mar 21 04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.473070 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 
04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.489944 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.922180 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015855 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015955 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016032 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016048 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016091 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016155 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016516 4839 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016583 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod 
"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016588 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run" (OuterVolumeSpecName: "var-run") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.017068 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.017726 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts" (OuterVolumeSpecName: "scripts") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.033347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk" (OuterVolumeSpecName: "kube-api-access-wxxxk") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "kube-api-access-wxxxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117589 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117619 4839 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117629 4839 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117639 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117648 4839 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556389 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerDied","Data":"c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d"} Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556428 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556479 4839 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.004717 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.011836 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.462030 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" path="/var/lib/kubelet/pods/0fc450b4-4ccf-4e6e-97d1-d47f252be788/volumes" Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.463254 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" path="/var/lib/kubelet/pods/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7/volumes" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.456210 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.462828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.518996 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.852764 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:43:32 crc kubenswrapper[4839]: I0321 04:43:32.200771 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.496653 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:33 crc kubenswrapper[4839]: E0321 04:43:33.497385 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.497400 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.497610 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.498177 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.501992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.517615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.600895 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.601041 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.702431 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.702512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: 
\"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.703243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.725631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.814443 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.815882 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.823041 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.826761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.909199 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.910484 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.916034 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.916121 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.920444 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.928445 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.011680 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.013265 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019255 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019460 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019485 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.020757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" 
(UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.036107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.064759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.105150 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122313 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122443 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " 
pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.123173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.128098 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.129302 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.136617 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.141816 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.144481 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.151083 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.152958 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.163199 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.217584 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.218808 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.221773 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.224052 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.224105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.225262 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.226040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228139 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " 
pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228879 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228971 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.229402 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.231886 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.237081 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.265763 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.267323 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.273861 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.275371 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.278454 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.293705 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330156 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330203 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330231 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 
04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330726 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330751 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.331005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.331688 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " 
pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.339226 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.346759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.363252 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432730 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: 
\"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432840 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432875 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.437295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.438831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.447108 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.449526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.464001 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.533948 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.534274 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.534936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.548817 4839 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.564000 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.615252 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:39 crc kubenswrapper[4839]: I0321 04:43:39.942856 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.172476 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.183124 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.187368 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb567c69c_d110_4ab2_aaf7_da82f0e72cc3.slice/crio-e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608 WatchSource:0}: Error finding container e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608: Status 404 returned error can't find the container with id e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.289210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.324711 
4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.338417 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.361706 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf816daf8_a9c7_4e99_a622_2f9bee7d203a.slice/crio-94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad WatchSource:0}: Error finding container 94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad: Status 404 returned error can't find the container with id 94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.495847 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.497684 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9848d2f0_c562_4b2a_bd1c_cd91c6754079.slice/crio-e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7 WatchSource:0}: Error finding container e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7: Status 404 returned error can't find the container with id e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.520709 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.537515 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.566916 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a240db_9587_446e_af12_a44b87b1a3ac.slice/crio-31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba WatchSource:0}: Error finding container 31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba: Status 404 returned error can't find the container with id 31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.577429 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a5cee9b_67b3_40b1_bc62_e6a3c4c1272d.slice/crio-47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227 WatchSource:0}: Error finding container 47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227: Status 404 returned error can't find the container with id 47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.687501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerStarted","Data":"94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.689652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.691585 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerStarted","Data":"47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.693113 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerStarted","Data":"5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.695447 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerStarted","Data":"1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.695479 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerStarted","Data":"33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.700403 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerStarted","Data":"61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.701499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerStarted","Data":"31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.705498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerStarted","Data":"1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.705542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" 
event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerStarted","Data":"e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.710457 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerStarted","Data":"534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.715067 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3d03-account-create-update-q9rgd" podStartSLOduration=7.715049071 podStartE2EDuration="7.715049071s" podCreationTimestamp="2026-03-21 04:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:40.710559066 +0000 UTC m=+1225.038345742" watchObservedRunningTime="2026-03-21 04:43:40.715049071 +0000 UTC m=+1225.042835747" Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.716893 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerStarted","Data":"79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.745367 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xc9zf" podStartSLOduration=7.745349099 podStartE2EDuration="7.745349099s" podCreationTimestamp="2026-03-21 04:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:40.740707329 +0000 UTC m=+1225.068494005" watchObservedRunningTime="2026-03-21 04:43:40.745349099 +0000 UTC m=+1225.073135775" Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.778861 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ng2tw" podStartSLOduration=2.781630836 podStartE2EDuration="15.778831086s" podCreationTimestamp="2026-03-21 04:43:25 +0000 UTC" firstStartedPulling="2026-03-21 04:43:26.499345993 +0000 UTC m=+1210.827132669" lastFinishedPulling="2026-03-21 04:43:39.496546243 +0000 UTC m=+1223.824332919" observedRunningTime="2026-03-21 04:43:40.768838886 +0000 UTC m=+1225.096625572" watchObservedRunningTime="2026-03-21 04:43:40.778831086 +0000 UTC m=+1225.106617762" Mar 21 04:43:41 crc kubenswrapper[4839]: E0321 04:43:41.334748 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf816daf8_a9c7_4e99_a622_2f9bee7d203a.slice/crio-conmon-1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.741323 4839 generic.go:334] "Generic (PLEG): container finished" podID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerID="1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.741893 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerDied","Data":"1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.747808 4839 generic.go:334] "Generic (PLEG): container finished" podID="34a240db-9587-446e-af12-a44b87b1a3ac" containerID="3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.747892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" 
event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerDied","Data":"3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.751212 4839 generic.go:334] "Generic (PLEG): container finished" podID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerID="1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.751303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerDied","Data":"1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.752946 4839 generic.go:334] "Generic (PLEG): container finished" podID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerID="3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.753033 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerDied","Data":"3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.754279 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerID="ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.754338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerDied","Data":"ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.760629 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerID="1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.760870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerDied","Data":"1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.764886 4839 generic.go:334] "Generic (PLEG): container finished" podID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerID="032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.764965 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerDied","Data":"032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.293761 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.301337 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.316138 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.323370 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.330932 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.343340 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358448 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358603 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358681 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358955 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.360382 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae4cf7f5-74ed-45d7-ace7-24ada744db6c" (UID: "ae4cf7f5-74ed-45d7-ace7-24ada744db6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.362074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7dfdbcf-7830-4f8d-a165-119fe80d999a" (UID: "a7dfdbcf-7830-4f8d-a165-119fe80d999a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.369136 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq" (OuterVolumeSpecName: "kube-api-access-7cgqq") pod "ae4cf7f5-74ed-45d7-ace7-24ada744db6c" (UID: "ae4cf7f5-74ed-45d7-ace7-24ada744db6c"). InnerVolumeSpecName "kube-api-access-7cgqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.384636 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p" (OuterVolumeSpecName: "kube-api-access-b4t9p") pod "a7dfdbcf-7830-4f8d-a165-119fe80d999a" (UID: "a7dfdbcf-7830-4f8d-a165-119fe80d999a"). InnerVolumeSpecName "kube-api-access-b4t9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.460781 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461383 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c8ad856-1b19-4b1c-8124-2e316dd567ee" (UID: "8c8ad856-1b19-4b1c-8124-2e316dd567ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461474 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461507 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461587 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461618 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"34a240db-9587-446e-af12-a44b87b1a3ac\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461730 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"34a240db-9587-446e-af12-a44b87b1a3ac\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461761 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461922 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f816daf8-a9c7-4e99-a622-2f9bee7d203a" (UID: "f816daf8-a9c7-4e99-a622-2f9bee7d203a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" (UID: "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462754 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b567c69c-d110-4ab2-aaf7-da82f0e72cc3" (UID: "b567c69c-d110-4ab2-aaf7-da82f0e72cc3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34a240db-9587-446e-af12-a44b87b1a3ac" (UID: "34a240db-9587-446e-af12-a44b87b1a3ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463383 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463538 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463681 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463770 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cgqq\" (UniqueName: 
\"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463863 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463945 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464054 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464170 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464258 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.466696 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn" (OuterVolumeSpecName: "kube-api-access-5brcn") pod "34a240db-9587-446e-af12-a44b87b1a3ac" (UID: "34a240db-9587-446e-af12-a44b87b1a3ac"). InnerVolumeSpecName "kube-api-access-5brcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.476737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q" (OuterVolumeSpecName: "kube-api-access-sfp6q") pod "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" (UID: "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d"). InnerVolumeSpecName "kube-api-access-sfp6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.479556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm" (OuterVolumeSpecName: "kube-api-access-9gmsm") pod "8c8ad856-1b19-4b1c-8124-2e316dd567ee" (UID: "8c8ad856-1b19-4b1c-8124-2e316dd567ee"). InnerVolumeSpecName "kube-api-access-9gmsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.480583 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77" (OuterVolumeSpecName: "kube-api-access-6fm77") pod "f816daf8-a9c7-4e99-a622-2f9bee7d203a" (UID: "f816daf8-a9c7-4e99-a622-2f9bee7d203a"). InnerVolumeSpecName "kube-api-access-6fm77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.498236 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5" (OuterVolumeSpecName: "kube-api-access-mtcj5") pod "b567c69c-d110-4ab2-aaf7-da82f0e72cc3" (UID: "b567c69c-d110-4ab2-aaf7-da82f0e72cc3"). InnerVolumeSpecName "kube-api-access-mtcj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565771 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565826 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565835 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565845 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565854 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807139 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerDied","Data":"61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807187 4839 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807156 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808580 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808601 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerDied","Data":"94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808651 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerDied","Data":"e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811252 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811309 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812774 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerDied","Data":"33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812800 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812850 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815480 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerDied","Data":"47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815525 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerDied","Data":"31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818500 4839 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818551 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821385 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821365 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerDied","Data":"534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821682 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef" Mar 21 04:43:51 crc kubenswrapper[4839]: I0321 04:43:51.864863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerStarted","Data":"412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"} Mar 21 04:43:51 crc kubenswrapper[4839]: I0321 04:43:51.900725 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qgdlf" podStartSLOduration=6.628246323 podStartE2EDuration="17.900699923s" podCreationTimestamp="2026-03-21 04:43:34 +0000 UTC" firstStartedPulling="2026-03-21 04:43:40.381920842 +0000 UTC m=+1224.709707528" lastFinishedPulling="2026-03-21 04:43:51.654374452 +0000 UTC m=+1235.982161128" observedRunningTime="2026-03-21 04:43:51.883924764 +0000 UTC m=+1236.211711450" watchObservedRunningTime="2026-03-21 04:43:51.900699923 +0000 UTC m=+1236.228486599" Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.879737 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4e5b10c70eee61e441b4838fa7c2c853ba2b08fd634d9e172c49c87b2faeef46"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"a9b37f3384b4b0eae51f61d02514ddd05f5f508134f5482ab9d86baa4db5f11a"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"a86c7925f0c57c463d27dda933e80a017e3b6ba16431475b33a0c1032c17688d"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880087 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"299e07809867521cdc6adb7c8e489a9dcd37b132e6dd10141d563e7b532c1da8"} Mar 21 04:43:54 crc kubenswrapper[4839]: I0321 04:43:54.901083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4f63cc95cc378ebf27d8d9ee231175db976cde31102666f063f8234adf98aef0"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.917962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"2543c12a287a398f236910833bc11731c9e0ee7068b40132c0ab3acbd115509f"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.918344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"06b61ebed8cd68ee21289b45b2c6605ec597911e951f4357b7aaba905adc5461"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.918359 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4d03449d96fdc2c9659cd49ad827e80ab4c39587576b6f127c94e3590e3edd32"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.929552 4839 generic.go:334] "Generic (PLEG): container finished" podID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerID="412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a" exitCode=0 Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.929689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerDied","Data":"412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.936088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"18fafb2b7cedaeb91e2ed9cae4eb4702c1bb5391ffb3b7f246e498bc73062cf8"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.936133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"92047ef2b6fad140e351aba2e4e1717c008862f404cab80c606272eecccccf5e"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.961715 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"9dc2a480548652ec0f569c2ff8001cc12924ece46aa719e945a332d0283f8c51"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.962063 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"5ad2a70ac0163e6e775fd4e3f93931ef5d151c0c8130145323cb39ebf1423ddd"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.962073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"28bcd0b01ffcf842c57eeb0a6600606662b8b4c1d2547451d70ae901579d7e66"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.391497 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.501264 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.502007 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.502095 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.506221 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2" (OuterVolumeSpecName: "kube-api-access-vcwg2") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "kube-api-access-vcwg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.526550 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.550443 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data" (OuterVolumeSpecName: "config-data") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604553 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604611 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604623 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970556 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerDied","Data":"5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970620 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970690 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.985786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"583bd871e4062ae14fed460f55bea0ef1edad5015242520ba12620d807ed1490"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.985824 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"16691c3afc23c06b01006fa62e6582e6657a5a82dd9dce41a2cfa310cd369135"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.033067 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.396027161 podStartE2EDuration="1m1.033048622s" podCreationTimestamp="2026-03-21 04:42:58 +0000 UTC" firstStartedPulling="2026-03-21 04:43:40.539434128 +0000 UTC m=+1224.867220804" lastFinishedPulling="2026-03-21 04:43:56.176455589 +0000 UTC m=+1240.504242265" observedRunningTime="2026-03-21 04:43:59.021028956 +0000 UTC m=+1243.348815632" watchObservedRunningTime="2026-03-21 04:43:59.033048622 +0000 UTC m=+1243.360835298" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.320605 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321001 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321023 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321036 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321045 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321056 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321064 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321085 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321107 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321121 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321127 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321137 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321143 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc 
kubenswrapper[4839]: E0321 04:43:59.321149 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321155 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321164 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321170 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321353 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321374 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321389 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321399 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321409 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321420 4839 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321433 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321455 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.322238 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.342359 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.368986 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.370243 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.398820 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399340 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399655 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399866 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.413661 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415698 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415892 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517218 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517524 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517643 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517679 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517726 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517749 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517768 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517786 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517807 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518621 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518731 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.519194 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.571173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.581183 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.582197 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593117 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4bb6p" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593304 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593405 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620450 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620525 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620631 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: 
\"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620664 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.636239 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.638000 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.642495 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.642919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.643946 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.646185 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.661526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.706917 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.727358 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735420 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735466 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735533 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735586 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735607 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735635 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.740646 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.742348 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754527 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pn6kj" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754695 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754824 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754992 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.791374 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.792620 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.811762 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.813425 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.818232 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5mrkq" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.838620 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.841993 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842134 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842163 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842241 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842263 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842299 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.869734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.869818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod 
\"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.888433 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.916599 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.921698 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.928970 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.936912 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.940312 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.942439 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944131 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944261 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 
04:43:59.944307 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944331 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944358 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.945320 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946384 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946932 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.957503 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.957985 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.958273 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qhlcg" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.958514 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.959025 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.980075 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.986841 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 
04:43:59.993434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.993510 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.002466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.025041 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.025166 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.031025 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qnmpn" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.031264 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.034099 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050530 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050589 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050611 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050631 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050700 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050717 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050738 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050761 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050779 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050794 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050834 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.064637 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.065366 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.079189 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrt9\" (UniqueName: 
\"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.095649 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.100464 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.102773 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.112795 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: E0321 04:44:00.143437 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-5r6lw ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b868669f-g4f92" podUID="9fec7a31-49df-4e3c-9266-8c21d7622445" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152380 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152435 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: 
\"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152486 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152671 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " 
pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152750 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152799 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc 
kubenswrapper[4839]: I0321 04:44:00.152861 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152914 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154041 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.155372 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.156078 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.157003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.158256 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.159207 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.159265 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.160334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.170959 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.175054 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.177207 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.181595 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.181741 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.188111 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.197129 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.215537 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 
04:44:00.217331 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.225602 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.239526 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256404 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256526 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 
21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256578 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256610 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256662 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256685 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256725 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256750 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256886 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257070 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257103 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257155 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.275024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.279225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.283186 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.307930 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.309693 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.312362 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.312650 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.313285 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.356579 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370888 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371011 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371110 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 
04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371239 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371561 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371637 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371669 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371689 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371717 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371772 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.372441 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" 
(UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.373346 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.373546 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.374752 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.376024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.376759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: 
I0321 04:44:00.377471 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.381097 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.391351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.394241 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.398538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.406205 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: 
\"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.410413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.459169 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476425 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476532 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478857 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 
04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478968 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.484532 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.489915 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.504067 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.516835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.553686 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.571387 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.578608 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: W0321 04:44:00.611280 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e86a461_8b9c_4850_b084_5a242058db02.slice/crio-3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386 WatchSource:0}: Error finding container 3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386: Status 404 returned error can't find the container with id 3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386 Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.617415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.631287 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.702358 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.808387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.941648 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.965155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:00 crc kubenswrapper[4839]: W0321 04:44:00.996901 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625a99bd_bc01_400e_8e9c_1f5eff390466.slice/crio-829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3 WatchSource:0}: Error finding container 829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3: Status 404 returned error can't find the container with id 829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3 Mar 21 04:44:01 crc kubenswrapper[4839]: W0321 04:44:01.006143 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02db0b32_3683_4d02_b645_3cea2cd59b7d.slice/crio-8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5 WatchSource:0}: Error finding container 8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5: Status 404 returned error can't find the container with id 8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5 Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.101020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" 
event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerStarted","Data":"3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.112023 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerStarted","Data":"70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.118817 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerStarted","Data":"829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.166621 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554fbfcbdf-wqcc5" event={"ID":"02db0b32-3683-4d02-b645-3cea2cd59b7d","Type":"ContainerStarted","Data":"8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.171677 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerStarted","Data":"4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.171749 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.229670 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304839 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304984 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305072 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305114 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305176 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305526 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.306215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.306702 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config" (OuterVolumeSpecName: "config") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307476 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307916 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307941 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307953 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307962 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307970 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.332930 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:44:01 crc 
kubenswrapper[4839]: I0321 04:44:01.335793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw" (OuterVolumeSpecName: "kube-api-access-5r6lw") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "kube-api-access-5r6lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.409401 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.635769 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.666943 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.679190 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:01 crc kubenswrapper[4839]: W0321 04:44:01.687288 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d0e1745_6e0b_475c_a1de_d049018abea6.slice/crio-00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d WatchSource:0}: Error finding container 00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d: Status 404 returned error can't find the container with id 00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.847831 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.889509 4839 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.120595 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.163916 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.163974 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.190065 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.210238 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.229936 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c9d7f5-72d27" event={"ID":"3193915f-60d3-4c8e-aa15-858213ce011c","Type":"ContainerStarted","Data":"6e4a51f272197d48dcbc81a0aecd9739f163dd5e41504342f75386e4fcf464f5"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.235008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerStarted","Data":"6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.236886 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.236956 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237079 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237115 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.253528 4839 generic.go:334] "Generic (PLEG): container finished" podID="5e86a461-8b9c-4850-b084-5a242058db02" containerID="85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918" exitCode=0 Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.253620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" 
event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerDied","Data":"85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.257295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"41ed81fbf037f8ebe50fd1cd4bb84f9e7c73f61ee6cb668dca265d806ca14d96"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.263237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.263288 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"2dbe5499f5ce6b46711307e191213d3557376716206b4f9aec95cbff6dcd4f72"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.293610 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerStarted","Data":"2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.306419 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerStarted","Data":"dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.308868 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" 
event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerStarted","Data":"00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.335335 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.335682 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerStarted","Data":"e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348004 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348228 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348482 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348683 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.354543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.355297 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.357817 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.372331 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " 
pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.383614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.389726 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5rr4j" podStartSLOduration=3.389699575 podStartE2EDuration="3.389699575s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:02.37233443 +0000 UTC m=+1246.700121126" watchObservedRunningTime="2026-03-21 04:44:02.389699575 +0000 UTC m=+1246.717486261"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.461796 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nm9t5" podStartSLOduration=3.4617797120000002 podStartE2EDuration="3.461779712s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:02.412130793 +0000 UTC m=+1246.739917469" watchObservedRunningTime="2026-03-21 04:44:02.461779712 +0000 UTC m=+1246.789566388"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.504217 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"]
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.516300 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"]
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.533374 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.665685 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766846 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766905 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766930 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.767000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.767042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.773962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz" (OuterVolumeSpecName: "kube-api-access-j28tz") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "kube-api-access-j28tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824211 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824259 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824794 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config" (OuterVolumeSpecName: "config") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.828514 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.871653 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872003 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872017 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872063 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872079 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.190913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.363123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerStarted","Data":"91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.366277 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c94899fc-bkxlk" event={"ID":"f9b42e2e-3015-4ae1-a3a9-3eb96949b021","Type":"ContainerStarted","Data":"04fe2b7baf42bfa7035c15041a5de93662e87c4359a787bd7c9b47e57eb2a7fa"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.376996 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerDied","Data":"3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.377042 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.377076 4839 scope.go:117] "RemoveContainer" containerID="85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389868 4839 generic.go:334] "Generic (PLEG): container finished" podID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" exitCode=0
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.450309 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" podStartSLOduration=4.450291966 podStartE2EDuration="4.450291966s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:03.449838733 +0000 UTC m=+1247.777625429" watchObservedRunningTime="2026-03-21 04:44:03.450291966 +0000 UTC m=+1247.778078642"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.499814 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"]
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.523219 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"]
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.419041 4839 generic.go:334] "Generic (PLEG): container finished" podID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerID="79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6" exitCode=0
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.419145 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerDied","Data":"79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"}
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.422854 4839 generic.go:334] "Generic (PLEG): container finished" podID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerID="91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1" exitCode=0
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.422945 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerDied","Data":"91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1"}
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.423245 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng"
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.484888 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e86a461-8b9c-4850-b084-5a242058db02" path="/var/lib/kubelet/pods/5e86a461-8b9c-4850-b084-5a242058db02/volumes"
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.485494 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fec7a31-49df-4e3c-9266-8c21d7622445" path="/var/lib/kubelet/pods/9fec7a31-49df-4e3c-9266-8c21d7622445/volumes"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.891651 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.923674 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"]
Mar 21 04:44:08 crc kubenswrapper[4839]: E0321 04:44:08.924056 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.924071 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.924264 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.925136 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.928294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.942271 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.970604 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.999309 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"]
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002519 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002580 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002628 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002672 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002730 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002753 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002769 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.004120 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.018949 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"]
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104428 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104520 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104596 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104648 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104686 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104699 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104718 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104750 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104791 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.105891 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.108084 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.108198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.112703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.112776 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.113017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.124855 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.205930 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.205986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206039 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206085 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206135 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206204 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.207184 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.207519 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.208774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.210679 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.211121 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.212790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.235584 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.261134 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.327950 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.491763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng"
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.546597 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"]
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.546874 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" containerID="cri-o://66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029" gracePeriod=10
Mar 21 04:44:11 crc kubenswrapper[4839]: I0321 04:44:11.487516 4839 generic.go:334] "Generic (PLEG): container finished" podID="67dd1633-1450-4153-b0af-b6887f61944c" containerID="66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029" exitCode=0
Mar 21 04:44:11 crc kubenswrapper[4839]: I0321 04:44:11.487561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029"}
Mar 21 04:44:13 crc kubenswrapper[4839]: I0321 04:44:13.639785 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Mar 21 04:44:14 crc kubenswrapper[4839]: I0321 04:44:14.581875 4839 generic.go:334] "Generic (PLEG): container finished" podID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerID="2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7" exitCode=0
Mar 21 04:44:14 crc kubenswrapper[4839]: I0321 04:44:14.581936 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerDied","Data":"2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7"}
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.259594 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw"
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.266641 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ng2tw"
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.412964 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413063 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413211 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413272 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.420517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg" (OuterVolumeSpecName: "kube-api-access-8phmg") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "kube-api-access-8phmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.420966 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.421209 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc" (OuterVolumeSpecName: "kube-api-access-rrqtc") pod "117f0438-5ab3-4616-b574-c5bbc43e8ac9" (UID: "117f0438-5ab3-4616-b574-c5bbc43e8ac9"). InnerVolumeSpecName "kube-api-access-rrqtc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.444013 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.465682 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data" (OuterVolumeSpecName: "config-data") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515956 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515989 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515999 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.516010 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.516019 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.597757 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerDied","Data":"6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"}
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.597796 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"
Mar 21 04:44:15 crc
kubenswrapper[4839]: I0321 04:44:15.597844 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605063 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerDied","Data":"65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e"} Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605297 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.338300 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.345694 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.464972 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" path="/var/lib/kubelet/pods/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3/volumes" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674100 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:16 crc kubenswrapper[4839]: E0321 04:44:16.674674 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674700 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" 
containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: E0321 04:44:16.674740 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674750 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674971 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.675007 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.676131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.687767 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.840542 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.840848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc 
kubenswrapper[4839]: I0321 04:44:16.840971 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841080 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841299 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841368 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 
21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943324 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943448 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944178 
4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944877 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.945007 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.945845 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.963690 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbq4\" (UniqueName: 
\"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.003351 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:17 crc kubenswrapper[4839]: E0321 04:44:17.062231 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 21 04:44:17 crc kubenswrapper[4839]: E0321 04:44:17.062689 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d5h58ch665h5fch55h67bh669h589h654h84hc9h86h5f5h5b4h678h586h548h96hf5h55dh59h594hc9h5c6h85h94h587h58ch57fh8fh67dh58q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPa
th:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zddh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6c266726-5bfd-4519-bdd5-9db7f6a77df4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.713637 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.715302 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.717309 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.717973 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.718008 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.741001 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859525 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc 
kubenswrapper[4839]: I0321 04:44:17.859681 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859697 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.873783 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.875611 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.882370 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.882629 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962109 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962173 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962201 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962263 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 
04:44:17.962334 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962433 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962502 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963096 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963420 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.968375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.969720 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.982004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.996484 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.015107 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.038718 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064751 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064831 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064855 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064966 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065009 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065065 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.166877 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167345 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167443 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167635 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.170145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.178785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.179243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: 
\"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.179705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.172295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.170833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.183552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.191905 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.196744 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.214086 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.494058 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.639785 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 04:44:19 crc kubenswrapper[4839]: I0321 04:44:19.891820 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:19 crc kubenswrapper[4839]: I0321 04:44:19.969243 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.631806 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: 
E0321 04:44:23.632560 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h99h5ffh5f6hc8hdchfdh66fh555h95h5dfh64ch574h555h65dh54dhbfh4h688h75h54fhc6h5f5h7bh4h5b8h679hc6h5d7h687h544h5b5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9ntv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-75c94899fc-bkxlk_openstack(f9b42e2e-3015-4ae1-a3a9-3eb96949b021): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.634799 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-75c94899fc-bkxlk" podUID="f9b42e2e-3015-4ae1-a3a9-3eb96949b021" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.636592 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.636704 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65fh654hb7h645hd4h555h58ch548h665hdfh9fh5f9h5dfh5b4h5d4h546h596h57ch685h79h96h5h5b5hffhcdh5b7h547h549h99h5fchdhd5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ccwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-554fbfcbdf-wqcc5_openstack(02db0b32-3683-4d02-b645-3cea2cd59b7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 
04:44:23.638672 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-554fbfcbdf-wqcc5" podUID="02db0b32-3683-4d02-b645-3cea2cd59b7d" Mar 21 04:44:23 crc kubenswrapper[4839]: I0321 04:44:23.641474 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 04:44:23 crc kubenswrapper[4839]: I0321 04:44:23.641656 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.651954 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.652158 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd6h56bh595h678h5cfh645hfdhc7h8bh78h68ch6ch86h556hb4h56fh576h5ch5dh54ch67h658hdh579h65ch598hd4h8fh666h66dh586h544q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6t7wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7c9d7f5-72d27_openstack(3193915f-60d3-4c8e-aa15-858213ce011c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.656820 
4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7c9d7f5-72d27" podUID="3193915f-60d3-4c8e-aa15-858213ce011c" Mar 21 04:44:29 crc kubenswrapper[4839]: I0321 04:44:29.737871 4839 generic.go:334] "Generic (PLEG): container finished" podID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerID="dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913" exitCode=0 Mar 21 04:44:29 crc kubenswrapper[4839]: I0321 04:44:29.737921 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerDied","Data":"dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.350291 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.359787 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.375777 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.379466 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.386375 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443688 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443751 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443799 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443825 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443843 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443880 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443900 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443925 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443971 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444008 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444026 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444055 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444080 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444115 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444144 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444162 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: 
\"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444194 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444226 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444284 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444303 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444343 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444370 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.445855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data" (OuterVolumeSpecName: "config-data") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.446346 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs" (OuterVolumeSpecName: "logs") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.447285 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts" (OuterVolumeSpecName: "scripts") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.447327 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs" (OuterVolumeSpecName: "logs") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.448836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts" (OuterVolumeSpecName: "scripts") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449059 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data" (OuterVolumeSpecName: "config-data") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449282 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data" (OuterVolumeSpecName: "config-data") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449480 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4" (OuterVolumeSpecName: "kube-api-access-ksgp4") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "kube-api-access-ksgp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.450778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs" (OuterVolumeSpecName: "logs") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.453828 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts" (OuterVolumeSpecName: "scripts") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.453865 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.454487 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c" (OuterVolumeSpecName: "kube-api-access-rqg7c") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "kube-api-access-rqg7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.458505 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.459697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts" (OuterVolumeSpecName: "scripts") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464712 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv" (OuterVolumeSpecName: "kube-api-access-s9ntv") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "kube-api-access-s9ntv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464877 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx" (OuterVolumeSpecName: "kube-api-access-5ccwx") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "kube-api-access-5ccwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464930 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq" (OuterVolumeSpecName: "kube-api-access-6t7wq") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "kube-api-access-6t7wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.489656 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data" (OuterVolumeSpecName: "config-data") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.493025 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.501394 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.504043 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config" (OuterVolumeSpecName: "config") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.509177 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.510083 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546064 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546073 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546081 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546089 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546098 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546105 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546113 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546120 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546128 4839 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546136 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546144 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546151 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546160 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546167 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546176 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546183 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546191 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546199 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546207 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546214 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546222 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546229 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546237 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546244 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546252 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.756215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c94899fc-bkxlk" event={"ID":"f9b42e2e-3015-4ae1-a3a9-3eb96949b021","Type":"ContainerDied","Data":"04fe2b7baf42bfa7035c15041a5de93662e87c4359a787bd7c9b47e57eb2a7fa"}
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.756300 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766453 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerDied","Data":"70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463"}
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766494 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766559 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775468 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"57de16c4224a656e8f3fcae76650a94702fb081fd5f9e8c3856fcde976889201"}
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775519 4839 scope.go:117] "RemoveContainer" containerID="66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775591 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.778437 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c9d7f5-72d27" event={"ID":"3193915f-60d3-4c8e-aa15-858213ce011c","Type":"ContainerDied","Data":"6e4a51f272197d48dcbc81a0aecd9739f163dd5e41504342f75386e4fcf464f5"}
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.778498 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.780111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554fbfcbdf-wqcc5" event={"ID":"02db0b32-3683-4d02-b645-3cea2cd59b7d","Type":"ContainerDied","Data":"8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5"}
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.780186 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5"
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.851320 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.860378 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.883889 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.924676 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.939819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.947744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.957265 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"]
Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.964936 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"]
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.435063 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"]
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.444061 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"]
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.464377 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02db0b32-3683-4d02-b645-3cea2cd59b7d" path="/var/lib/kubelet/pods/02db0b32-3683-4d02-b645-3cea2cd59b7d/volumes"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.465475 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3193915f-60d3-4c8e-aa15-858213ce011c" path="/var/lib/kubelet/pods/3193915f-60d3-4c8e-aa15-858213ce011c/volumes"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.466385 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dd1633-1450-4153-b0af-b6887f61944c" path="/var/lib/kubelet/pods/67dd1633-1450-4153-b0af-b6887f61944c/volumes"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.467754 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" path="/var/lib/kubelet/pods/848aa53a-bd67-4733-aad7-6ac0f6fc0a15/volumes"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.468715 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b42e2e-3015-4ae1-a3a9-3eb96949b021" path="/var/lib/kubelet/pods/f9b42e2e-3015-4ae1-a3a9-3eb96949b021/volumes"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.476781 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.476984 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6mpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qfjms_openstack(6000d2d4-e84a-443f-9094-ab999541331d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.480760 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qfjms" podUID="6000d2d4-e84a-443f-9094-ab999541331d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.534544 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nm9t5"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538357 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538748 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="init"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538768 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="init"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538778 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538787 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538820 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538826 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538837 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538842 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539009 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539060 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539072 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539649 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542285 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542533 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542845 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.543090 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.543173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.544862 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564168 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") "
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") "
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564444 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") "
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564958 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565202 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.567050 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9" (OuterVolumeSpecName: "kube-api-access-zhrt9") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "kube-api-access-zhrt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.588484 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.593199 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config" (OuterVolumeSpecName: "config") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666599 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666682 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666747 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666786 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666834 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666881 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666893 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666902 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.670727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.670832 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.671783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.679622 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.680779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.681887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.780644 4839 scope.go:117] "RemoveContainer" containerID="285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.795920 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nm9t5"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.795966 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerDied","Data":"829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3"}
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.796001 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3"
Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.800827 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qfjms" podUID="6000d2d4-e84a-443f-9094-ab999541331d"
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.920807 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.309627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"]
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.322044 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.328561 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"]
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.550548 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.575399 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"]
Mar 21 04:44:33 crc kubenswrapper[4839]: W0321 04:44:33.577513 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cada35b_7e7f_4d22_895f_588b90e48c70.slice/crio-e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6 WatchSource:0}: Error finding container e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6: Status 404 returned error can't find the container with id e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.640066 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.811782 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"]
Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.813767 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerStarted","Data":"e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.818275 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerStarted","Data":"2250ba76cd4a4ca0b77733a0ad0032843aaa7963ae9b1e3267d54ad5c182b9aa"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.822093 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"b5753b189f3fee68b09fb93ec56788b978b6a6741d48ecf04c45ca76fee101e1"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.826825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"03888d296ca7ab53ceb5c3abefacbc011c167a0d8c4b77409fc31b4badc571f5"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.856827 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerStarted","Data":"143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.871397 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.881941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerStarted","Data":"3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.882249 4839 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.889956 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.895433 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.902516 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.902917 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.903051 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.903172 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5mrkq" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.908903 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.937168 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-t8kxj" podStartSLOduration=4.214767286 podStartE2EDuration="34.937134793s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="2026-03-21 04:44:01.699786965 +0000 UTC m=+1246.027573641" lastFinishedPulling="2026-03-21 04:44:32.422154462 +0000 UTC m=+1276.749941148" observedRunningTime="2026-03-21 04:44:33.884727237 +0000 UTC m=+1278.212513913" watchObservedRunningTime="2026-03-21 04:44:33.937134793 +0000 UTC m=+1278.264921489" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.962380 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.999872 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.999940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:33.999982 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000427 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: 
\"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000675 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000759 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.001239 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.001266 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.002363 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wdddk" podStartSLOduration=5.121571794 podStartE2EDuration="35.002339077s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" 
firstStartedPulling="2026-03-21 04:44:01.363717193 +0000 UTC m=+1245.691503869" lastFinishedPulling="2026-03-21 04:44:31.244484476 +0000 UTC m=+1275.572271152" observedRunningTime="2026-03-21 04:44:33.957277206 +0000 UTC m=+1278.285063892" watchObservedRunningTime="2026-03-21 04:44:34.002339077 +0000 UTC m=+1278.330125753" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.004633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.004674 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110460 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110484 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110593 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110796 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111254 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111392 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.112066 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.113008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.120504 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.122180 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: 
\"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.124043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.124336 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.130677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.134922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.207315 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.221110 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.290433 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:34 crc kubenswrapper[4839]: W0321 04:44:34.298007 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6008d784_c50b_4079_a7b4_c160b8202956.slice/crio-bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99 WatchSource:0}: Error finding container bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99: Status 404 returned error can't find the container with id bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99 Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.905546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerStarted","Data":"848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.914764 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.921408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.923962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" 
event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.931365 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.935274 4839 generic.go:334] "Generic (PLEG): container finished" podID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerID="74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e" exitCode=0 Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.935381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerDied","Data":"74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.939294 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ts52d" podStartSLOduration=2.9392708069999998 podStartE2EDuration="2.939270807s" podCreationTimestamp="2026-03-21 04:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:34.927999672 +0000 UTC m=+1279.255786348" watchObservedRunningTime="2026-03-21 04:44:34.939270807 +0000 UTC m=+1279.267057483" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.954406 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"04b6e51342deeff7b4d476258be9896a458d5b6ccb291866248471c868e4ea4c"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.978799 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:34 crc kubenswrapper[4839]: W0321 04:44:34.992415 4839 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeba2ea1_e2d1_46d2_8e89_982cd58f3b15.slice/crio-594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b WatchSource:0}: Error finding container 594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b: Status 404 returned error can't find the container with id 594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.125482 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:35 crc kubenswrapper[4839]: W0321 04:44:35.130452 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d7f6c0e_7fdd_4ed0_94ad_e1b044f296f6.slice/crio-895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e WatchSource:0}: Error finding container 895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e: Status 404 returned error can't find the container with id 895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.471073 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552083 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552484 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552529 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552634 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.586687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config" (OuterVolumeSpecName: "config") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.616202 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.625133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4" (OuterVolumeSpecName: "kube-api-access-6gbq4") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "kube-api-access-6gbq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.626699 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.626776 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.637021 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665041 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665072 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665114 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") on node \"crc\" DevicePath \"\"" Mar 21 
04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665137 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665148 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.986912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.987040 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"e44c81ce53fb7cb7cb67615e87aced0fc7bd4c886cbd53ea268fc23a5209a592"} Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.993523 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003023 4839 generic.go:334] "Generic (PLEG): container finished" podID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerID="4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747" exitCode=0 Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003102 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" 
event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003129 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerStarted","Data":"594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.010624 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.016168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079597 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerDied","Data":"2250ba76cd4a4ca0b77733a0ad0032843aaa7963ae9b1e3267d54ad5c182b9aa"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079668 4839 scope.go:117] "RemoveContainer" containerID="74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079907 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.112092 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"8ea4c51163beea5b097beaa576e1811227967d58157ac9043d0eaf20c8c0eed3"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.122501 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84c6c985f8-v5cmh" podStartSLOduration=26.966153139 podStartE2EDuration="28.122477398s" podCreationTimestamp="2026-03-21 04:44:08 +0000 UTC" firstStartedPulling="2026-03-21 04:44:33.322392995 +0000 UTC m=+1277.650179671" lastFinishedPulling="2026-03-21 04:44:34.478717254 +0000 UTC m=+1278.806503930" observedRunningTime="2026-03-21 04:44:36.111609584 +0000 UTC m=+1280.439396280" watchObservedRunningTime="2026-03-21 04:44:36.122477398 +0000 UTC m=+1280.450264084" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.193619 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9c97f4dbd-k2scs" podStartSLOduration=27.034844181 podStartE2EDuration="28.193597128s" podCreationTimestamp="2026-03-21 04:44:08 +0000 UTC" firstStartedPulling="2026-03-21 04:44:33.321792338 +0000 UTC m=+1277.649579014" lastFinishedPulling="2026-03-21 04:44:34.480545285 +0000 UTC m=+1278.808331961" observedRunningTime="2026-03-21 04:44:36.151986433 +0000 UTC m=+1280.479773129" watchObservedRunningTime="2026-03-21 04:44:36.193597128 +0000 UTC m=+1280.521383804" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.211493 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.225017 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:36 crc kubenswrapper[4839]: 
I0321 04:44:36.477801 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" path="/var/lib/kubelet/pods/93294d9d-21ef-43b5-bac5-35d24543d02a/volumes" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.519094 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:36 crc kubenswrapper[4839]: E0321 04:44:36.520103 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.520128 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.520351 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.521449 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.525809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.526087 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.548727 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609508 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610464 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610538 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712832 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712858 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712998 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713096 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713185 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod 
\"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.718060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.719345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.720429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.721616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.722327 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc 
kubenswrapper[4839]: I0321 04:44:36.723115 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.733910 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.904062 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.141781 4839 generic.go:334] "Generic (PLEG): container finished" podID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerID="3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe" exitCode=0 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.142138 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerDied","Data":"3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.153906 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.154810 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 
04:44:37.156355 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.162308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerStarted","Data":"715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.162517 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165236 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log" containerID="cri-o://656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb" gracePeriod=30 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165363 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd" containerID="cri-o://0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94" gracePeriod=30 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.239688 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d447b4d96-qkb69" 
podStartSLOduration=4.239666561 podStartE2EDuration="4.239666561s" podCreationTimestamp="2026-03-21 04:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.21067525 +0000 UTC m=+1281.538461946" watchObservedRunningTime="2026-03-21 04:44:37.239666561 +0000 UTC m=+1281.567453237" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.261746 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=21.261721628 podStartE2EDuration="21.261721628s" podCreationTimestamp="2026-03-21 04:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.232815969 +0000 UTC m=+1281.560602655" watchObservedRunningTime="2026-03-21 04:44:37.261721628 +0000 UTC m=+1281.589508304" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.285245 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podStartSLOduration=4.285216935 podStartE2EDuration="4.285216935s" podCreationTimestamp="2026-03-21 04:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.256542163 +0000 UTC m=+1281.584328849" watchObservedRunningTime="2026-03-21 04:44:37.285216935 +0000 UTC m=+1281.613003621" Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.165206 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186337 4839 generic.go:334] "Generic (PLEG): container finished" podID="6008d784-c50b-4079-a7b4-c160b8202956" containerID="0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94" exitCode=0 Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 
04:44:38.186376 4839 generic.go:334] "Generic (PLEG): container finished" podID="6008d784-c50b-4079-a7b4-c160b8202956" containerID="656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb" exitCode=143 Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186376 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94"} Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186420 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.198157 4839 generic.go:334] "Generic (PLEG): container finished" podID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerID="143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95" exitCode=0 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.198737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerDied","Data":"143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202409 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202558 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log" 
containerID="cri-o://c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3" gracePeriod=30 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202872 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd" containerID="cri-o://169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77" gracePeriod=30 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.205267 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"965f07a77abecc2dcc57bce66cbf446672e1ba03feecba479f6dc24ff5964cee"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.261531 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.263205 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.328732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.328784 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77"} Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218144 4839 generic.go:334] "Generic (PLEG): container finished" podID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" 
containerID="169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77" exitCode=0 Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218613 4839 generic.go:334] "Generic (PLEG): container finished" podID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerID="c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3" exitCode=143 Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3"} Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.226645 4839 generic.go:334] "Generic (PLEG): container finished" podID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerID="848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4" exitCode=0 Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.226981 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerDied","Data":"848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"} Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.255889 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.255870796 podStartE2EDuration="25.255870796s" podCreationTimestamp="2026-03-21 04:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:39.237951224 +0000 UTC m=+1283.565737910" watchObservedRunningTime="2026-03-21 04:44:41.255870796 +0000 UTC m=+1285.583657472" Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.957611 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.056881 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057308 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057364 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057440 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.058272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs" (OuterVolumeSpecName: "logs") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.065108 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts" (OuterVolumeSpecName: "scripts") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.065490 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v" (OuterVolumeSpecName: "kube-api-access-tdb5v") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "kube-api-access-tdb5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.090323 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data" (OuterVolumeSpecName: "config-data") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.090759 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159805 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159845 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159856 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159864 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159876 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerDied","Data":"e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47"} Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240806 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240826 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162109 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:43 crc kubenswrapper[4839]: E0321 04:44:43.162708 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162720 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162891 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.163815 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170249 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170457 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qhlcg" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170612 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170471 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170745 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.196090 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278721 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278790 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278806 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278875 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380361 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380405 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380444 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380538 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: 
\"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.382253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.385049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.386295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.389003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: 
\"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.389651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.398856 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.400240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.487416 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.209762 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.274280 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.274522 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns" containerID="cri-o://14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" gracePeriod=10 Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.733110 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.773081 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.784316 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829441 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829517 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829766 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829820 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.834664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.833876 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835360 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs" (OuterVolumeSpecName: "logs") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835495 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835524 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.860585 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts" (OuterVolumeSpecName: "scripts") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.861353 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r" (OuterVolumeSpecName: "kube-api-access-7wc6r") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "kube-api-access-7wc6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.861835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.937524 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940706 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940804 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941010 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941191 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941253 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942542 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942562 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942603 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.948183 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955" (OuterVolumeSpecName: "kube-api-access-5h955") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "kube-api-access-5h955". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.973527 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.977215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.978838 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv" (OuterVolumeSpecName: "kube-api-access-hkzvv") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "kube-api-access-hkzvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.984342 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.991956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts" (OuterVolumeSpecName: "scripts") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.014295 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044321 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044352 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044366 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044378 4839 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044389 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044400 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.050377 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.100093 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146705 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146704 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146853 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148882 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.149063 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.149292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.150670 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.150720 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 
04:44:45.150776 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.151193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.151237 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.153279 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.154396 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.156450 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.156861 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs" (OuterVolumeSpecName: "logs") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.165223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc" (OuterVolumeSpecName: "kube-api-access-zdqdc") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "kube-api-access-zdqdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.178461 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg" (OuterVolumeSpecName: "kube-api-access-n7fdg") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "kube-api-access-n7fdg". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.178804 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.181073 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.182695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts" (OuterVolumeSpecName: "scripts") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.182730 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data" (OuterVolumeSpecName: "config-data") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.195669 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.200512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data" (OuterVolumeSpecName: "config-data") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.226070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255167 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255205 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255218 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255228 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255243 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255254 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255266 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255276 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255300 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255313 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255326 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255339 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.262814 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.274498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.279885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"3897ef34a5a560221b0da70d53a0118dcc2423f236d8ea84230926286a71f6ee"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.282173 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config" (OuterVolumeSpecName: "config") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.282701 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data" (OuterVolumeSpecName: "config-data") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerDied","Data":"00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285558 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285638 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.286509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.290212 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296833 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296835 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296915 4839 scope.go:117] "RemoveContainer" containerID="169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.307826 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ts52d"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.308510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerDied","Data":"e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.308604 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.314039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.314116 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.317299 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.320027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321410 4839 generic.go:334] "Generic (PLEG): container finished" podID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" exitCode=0
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"2dbe5499f5ce6b46711307e191213d3557376716206b4f9aec95cbff6dcd4f72"}
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321495 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.327595 4839 scope.go:117] "RemoveContainer" containerID="c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.334534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.357123 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390064 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390095 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390106 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390116 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390124 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390132 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390141 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.394708 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.414273 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.450482 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.466610 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467005 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467018 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467033 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467039 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467047 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467054 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467068 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467074 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467088 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467094 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467108 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="init"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467114 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="init"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467125 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467132 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync"
Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467145 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467153 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467295 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467321 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467338 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467349 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467359 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467366 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467380 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.468290 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.474834 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.475719 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.475879 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.476034 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.485077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.485869 4839 scope.go:117] "RemoveContainer" containerID="0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.491997 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492097 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492116 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492209 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492354 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.500498 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.506545 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.508472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.509076 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.525035 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.569347 4839 scope.go:117] "RemoveContainer" containerID="656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594102 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594155 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594188 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594218 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594887 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.595714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.596522 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.602038 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.607326 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.607380 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.616365 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.621560 4839 scope.go:117] "RemoveContainer" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.628433 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.644921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695584 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695672 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695699 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695836 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695882 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.787699 4839 scope.go:117] "RemoveContainer" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.795774 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"]
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807494 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807877 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807997 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.818635 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.819489 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.820040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID:
\"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.827667 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.828188 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.845689 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.857800 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.859627 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.871132 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.874643 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.893753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.914314 4839 scope.go:117] "RemoveContainer" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.918940 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": container with ID starting with 14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598 not found: ID does not exist" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.918984 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"} err="failed to get container status \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": rpc error: code = NotFound desc = could not find container \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": container with ID starting with 
14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598 not found: ID does not exist" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.919008 4839 scope.go:117] "RemoveContainer" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.925173 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": container with ID starting with 52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349 not found: ID does not exist" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.925214 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"} err="failed to get container status \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": rpc error: code = NotFound desc = could not find container \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": container with ID starting with 52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349 not found: ID does not exist" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.005461 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.007060 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011252 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qnmpn" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011372 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011444 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.033394 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.034816 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.037532 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.066628 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118486 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118542 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118608 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118635 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118671 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118727 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118800 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.132620 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.166163 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.167881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.181305 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cb996784d-fvhvp"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.181957 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.184314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191100 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191617 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191853 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.192147 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.203947 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-cb996784d-fvhvp"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223741 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223798 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223844 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: 
\"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224017 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224205 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod 
\"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.226919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.227677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.230296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.230528 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.231222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" 
(UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.234270 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.234992 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.235775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.260516 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.281475 4839 scope.go:117] "RemoveContainer" containerID="3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.287718 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.295435 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.299808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.309353 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.321964 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325721 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325794 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325834 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325854 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325902 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325977 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.326019 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.376603 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.376825 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.386815 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.389720 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.399877 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.400271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.401167 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.403044 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.412239 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.413903 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428176 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428219 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428269 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428286 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428311 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428409 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428448 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428488 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428519 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: 
I0321 04:44:46.428558 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428714 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.431643 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.432253 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.432858 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.433190 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.434043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.435074 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.437267 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.439073 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.439145 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.440244 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.442124 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.451968 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.457206 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.480950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.487276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.490268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.490704 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.511074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.528167 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-98964f649-mrjrt" podStartSLOduration=10.528147519000001 podStartE2EDuration="10.528147519s" podCreationTimestamp="2026-03-21 04:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:46.457365399 +0000 UTC m=+1290.785152075" watchObservedRunningTime="2026-03-21 04:44:46.528147519 +0000 UTC m=+1290.855934195" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532804 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532989 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533092 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533206 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533252 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533342 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533404 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533450 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.543849 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.550488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.574148 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.575733 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.587732 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" path="/var/lib/kubelet/pods/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.588523 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6008d784-c50b-4079-a7b4-c160b8202956" path="/var/lib/kubelet/pods/6008d784-c50b-4079-a7b4-c160b8202956/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.588703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.601534 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" path="/var/lib/kubelet/pods/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620434 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620874 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.641033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.641105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.689725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690044 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" 
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690079 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690118 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690184 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690351 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690417 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " 
pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.693486 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.704854 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.707883 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5788c8f798-khqlb" podStartSLOduration=3.707869757 podStartE2EDuration="3.707869757s" podCreationTimestamp="2026-03-21 04:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:46.535220937 +0000 UTC m=+1290.863007613" watchObservedRunningTime="2026-03-21 04:44:46.707869757 +0000 UTC m=+1291.035656433"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.708100 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.724401 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.745151 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.750370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.755707 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.760485 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.767136 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.768761 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"]
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.775020 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.802460 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"]
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.802559 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805376 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805402 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.908842 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.909444 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.913001 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.929401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.946500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.952301 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.967278 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.995636 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76"
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.009891 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-db77b8b5f-grbp8"
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.127202 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.136199 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.259172 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.408377 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cb996784d-fvhvp"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.429863 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"]
Mar 21 04:44:47 crc kubenswrapper[4839]: W0321 04:44:47.431689 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392fb516_8745_40fb_b38d_53106c8310df.slice/crio-aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e WatchSource:0}: Error finding container aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e: Status 404 returned error can't find the container with id aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.607533 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.624904 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.635589 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerStarted","Data":"89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac"}
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.661909 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qfjms" podStartSLOduration=4.662872252 podStartE2EDuration="48.661888245s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="2026-03-21 04:44:01.030684456 +0000 UTC m=+1245.358471132" lastFinishedPulling="2026-03-21 04:44:45.029700459 +0000 UTC m=+1289.357487125" observedRunningTime="2026-03-21 04:44:47.658952143 +0000 UTC m=+1291.986738839" watchObservedRunningTime="2026-03-21 04:44:47.661888245 +0000 UTC m=+1291.989674921"
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.673020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e"}
Mar 21 04:44:47 crc kubenswrapper[4839]: W0321 04:44:47.675585 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c48ff3_782c_4f5a_8a20_0736565e247a.slice/crio-75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d WatchSource:0}: Error finding container 75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d: Status 404 returned error can't find the container with id 75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.676953 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"d1e4b5b263d8711e41038cc9c72c0cf72e4c984c445036bd50e6733715123ea1"}
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.689945 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cb996784d-fvhvp" event={"ID":"6a3fcdf0-3099-467b-928b-89a4876130fe","Type":"ContainerStarted","Data":"bd36894c5fe54247c8be01f9d997698010ac96edc9b8aff94a62d4d015134c51"}
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.708852 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"5a91be0fc03b74a93a2e4a4a9276bbbc5ad15e2d8798017231a484ecfe7f2ff6"}
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.715385 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"b3b20c1c58919d92014e2b8b23b7b38a20303479312dd8c6c51224fcd3f18728"}
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.917173 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.961714 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"]
Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.981262 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"]
Mar 21 04:44:48 crc kubenswrapper[4839]: W0321 04:44:48.076920 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c8735c_f1c9_40f7_bd34_60bb0749bc23.slice/crio-eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31 WatchSource:0}: Error finding container eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31: Status 404 returned error can't find the container with id eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.755793 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.756140 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.763037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cb996784d-fvhvp" event={"ID":"6a3fcdf0-3099-467b-928b-89a4876130fe","Type":"ContainerStarted","Data":"995e93aab27ceb2b4a1d4104b2418622a19b1beaed4ed2fe4d5c82d14a2d55d4"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.763112 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-cb996784d-fvhvp"
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.778439 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.790617 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cb996784d-fvhvp" podStartSLOduration=2.790594172 podStartE2EDuration="2.790594172s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:48.779207773 +0000 UTC m=+1293.106994449" watchObservedRunningTime="2026-03-21 04:44:48.790594172 +0000 UTC m=+1293.118380848"
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.791697 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"70074b5be9f0a546822a32bf575056e24fb92673a44bc210c38a301ddd216343"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794861 4839 generic.go:334] "Generic (PLEG): container finished" podID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerID="d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76" exitCode=0
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerStarted","Data":"2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.808918 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.809240 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.832544 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"fca5e18e77c121faaaf0ada8e901cc40df5819da429b90285bcb8be9c693b177"}
Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.841321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.266359 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.329828 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c97f4dbd-k2scs" podUID="579308eb-854d-4160-ad35-8677f2d0e634" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.494927 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"]
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.522708 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"]
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.525738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.527853 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.528117 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.544152 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"]
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633158 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzwb8\" (UniqueName: \"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633435 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633508 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633600 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633622 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735583 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735636 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735661 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735729 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735801 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzwb8\" (UniqueName: \"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735848 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.736631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.743682 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.747490 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.749951 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.751702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.753192 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.755199 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzwb8\" (UniqueName: \"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.862184 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.867696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.867772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59cc78c49d-pkvb5"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.868133 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59cc78c49d-pkvb5"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.878632 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.878835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d9cf4c794-jb7lf"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.890254 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.890238025 podStartE2EDuration="4.890238025s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.884109453 +0000 UTC m=+1294.211896129" watchObservedRunningTime="2026-03-21 04:44:49.890238025 +0000 UTC m=+1294.218024701"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.900392 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerStarted","Data":"d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.901207 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.906886 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59cc78c49d-pkvb5" podStartSLOduration=3.9068737799999997 podStartE2EDuration="3.90687378s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.905620145 +0000 UTC m=+1294.233406821" watchObservedRunningTime="2026-03-21 04:44:49.90687378 +0000 UTC m=+1294.234660456"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.912986 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51"}
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.913054 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.913342 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67dd687666-pgfc5"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.935790 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.935761508 podStartE2EDuration="4.935761508s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.933827594 +0000 UTC m=+1294.261614270" watchObservedRunningTime="2026-03-21 04:44:49.935761508 +0000 UTC m=+1294.263548184"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.959926 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podStartSLOduration=3.959909934 podStartE2EDuration="3.959909934s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.953502344 +0000 UTC m=+1294.281289020" watchObservedRunningTime="2026-03-21 04:44:49.959909934 +0000 UTC m=+1294.287696610"
Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.981971 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67dd687666-pgfc5" podStartSLOduration=3.98194831 podStartE2EDuration="3.98194831s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.973312109 +0000 UTC m=+1294.301098795" watchObservedRunningTime="2026-03-21 04:44:49.98194831 +0000 UTC m=+1294.309734986"
Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.502403 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"]
Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.920548 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"d854a871bf480c473cddf8b21d5c13c7c57ca42f1882e6429c31909d901bd04a"}
Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.921179 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59cc78c49d-pkvb5" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" containerID="cri-o://3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" gracePeriod=30
Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.921140 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59cc78c49d-pkvb5" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" containerID="cri-o://006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" gracePeriod=30
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.599321 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5"
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691610 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") "
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691669 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") "
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691730 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") "
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691781 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") "
Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691841 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod
\"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.692036 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs" (OuterVolumeSpecName: "logs") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.692468 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.695613 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l" (OuterVolumeSpecName: "kube-api-access-lsv2l") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "kube-api-access-lsv2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.697687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.733722 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793754 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793780 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793788 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.810594 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data" (OuterVolumeSpecName: "config-data") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.904045 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.934997 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.939401 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.939444 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.950377 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"0109ad2133b2847c0789961fe4df245b3330bac2bdaa17fccfe8427de3b0f70a"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.950428 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"225ca07006eea0413be478bddfc73f7cf7ca14abc2e370b12af874b935b50e00"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 
04:44:51.962996 4839 generic.go:334] "Generic (PLEG): container finished" podID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" exitCode=0 Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963029 4839 generic.go:334] "Generic (PLEG): container finished" podID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" exitCode=143 Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963100 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963128 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963228 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.973317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"a6b1b6f8886247bae06bf792f12be02011d856b581bc06b70f970fa3ce5a8ba9"} Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.001436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"8bd67185751e9518d82306412b1a995f91b84a70526a6d67cfc3e74508e78318"} Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.003687 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podStartSLOduration=3.417777461 podStartE2EDuration="7.003662418s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="2026-03-21 04:44:47.403667611 +0000 UTC m=+1291.731454287" lastFinishedPulling="2026-03-21 04:44:50.989552568 +0000 UTC m=+1295.317339244" observedRunningTime="2026-03-21 04:44:51.971173609 +0000 UTC m=+1296.298960285" watchObservedRunningTime="2026-03-21 04:44:52.003662418 +0000 UTC m=+1296.331449094" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.005185 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" podStartSLOduration=3.090337297 podStartE2EDuration="6.00517622s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="2026-03-21 04:44:48.080322031 +0000 UTC m=+1292.408108707" lastFinishedPulling="2026-03-21 04:44:50.995160954 +0000 UTC m=+1295.322947630" observedRunningTime="2026-03-21 04:44:51.996777125 +0000 UTC m=+1296.324563801" watchObservedRunningTime="2026-03-21 04:44:52.00517622 +0000 UTC m=+1296.332962896" Mar 21 04:44:52 
crc kubenswrapper[4839]: I0321 04:44:52.027493 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-db77b8b5f-grbp8" podStartSLOduration=3.060209184 podStartE2EDuration="6.027479244s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="2026-03-21 04:44:48.02271675 +0000 UTC m=+1292.350503436" lastFinishedPulling="2026-03-21 04:44:50.98998682 +0000 UTC m=+1295.317773496" observedRunningTime="2026-03-21 04:44:52.024405978 +0000 UTC m=+1296.352192654" watchObservedRunningTime="2026-03-21 04:44:52.027479244 +0000 UTC m=+1296.355265920" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.059822 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.059830 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.072692 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.092553 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.098688 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103444 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:52 crc kubenswrapper[4839]: E0321 04:44:52.103725 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 
3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103819 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} err="failed to get container status \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103910 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: E0321 04:44:52.106777 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not exist" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.106814 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} err="failed to get container status \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not 
exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.106840 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.107085 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} err="failed to get container status \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.107104 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.109386 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} err="failed to get container status \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.499640 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" path="/var/lib/kubelet/pods/08c48ff3-782c-4f5a-8a20-0736565e247a/volumes" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.018869 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" 
event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.031998 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"ead5ea8cce6422b9395b5e4042e0f142a18f226fac54adc05d9902b6f49bcd2b"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.034849 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.034891 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.042217 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"9d92b450d693bd25d9bf1a793ee362f3b967d2ff5568fa333eb1ee5c645df9e7"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.042707 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-77466dd775-brs5x" podStartSLOduration=4.483561038 podStartE2EDuration="8.042684805s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="2026-03-21 04:44:47.433929488 +0000 UTC m=+1291.761716164" lastFinishedPulling="2026-03-21 04:44:50.993053255 +0000 UTC m=+1295.320839931" observedRunningTime="2026-03-21 04:44:53.039715602 +0000 UTC m=+1297.367502278" watchObservedRunningTime="2026-03-21 04:44:53.042684805 +0000 UTC m=+1297.370471481" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.074004 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d9cf4c794-jb7lf" podStartSLOduration=4.073986201 
podStartE2EDuration="4.073986201s" podCreationTimestamp="2026-03-21 04:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:53.062122779 +0000 UTC m=+1297.389909465" watchObservedRunningTime="2026-03-21 04:44:53.073986201 +0000 UTC m=+1297.401772877" Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.050483 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77466dd775-brs5x" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log" containerID="cri-o://ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.050530 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77466dd775-brs5x" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker" containerID="cri-o://fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.051854 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log" containerID="cri-o://a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.051960 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener" containerID="cri-o://d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401" gracePeriod=30 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.060554 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="392fb516-8745-40fb-b38d-53106c8310df" containerID="ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2" exitCode=143 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.060604 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066405 4839 generic.go:334] "Generic (PLEG): container finished" podID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerID="d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401" exitCode=0 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066433 4839 generic.go:334] "Generic (PLEG): container finished" podID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerID="a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e" exitCode=143 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.984532 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.985442 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 
04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.027921 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.039726 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.083108 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.083147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.183431 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.183476 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.221716 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.493689 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.554831 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.555062 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns" containerID="cri-o://715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68" gracePeriod=10 Mar 21 04:44:56 crc 
kubenswrapper[4839]: I0321 04:44:56.736242 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.091787 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.092202 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.134837 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.209327 4839 generic.go:334] "Generic (PLEG): container finished" podID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerID="715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68" exitCode=0 Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.210367 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68"} Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.859028 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.987814 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.209293 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.259813 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.259935 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.262705 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.292826 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.329327 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c97f4dbd-k2scs" podUID="579308eb-854d-4160-ad35-8677f2d0e634" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.697863 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.698215 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.754743 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.151602 4839 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 04:45:00 crc kubenswrapper[4839]: E0321 04:45:00.152050 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152071 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: E0321 04:45:00.152111 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152119 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152317 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152358 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.153079 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.155847 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.155848 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.172091 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213377 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213411 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315232 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315304 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315592 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.317014 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.364048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.370272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.471583 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.980379 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.980741 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:45:01 crc kubenswrapper[4839]: I0321 04:45:01.243859 4839 generic.go:334] "Generic (PLEG): container finished" podID="392fb516-8745-40fb-b38d-53106c8310df" containerID="fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e" exitCode=0 Mar 21 04:45:01 crc kubenswrapper[4839]: I0321 04:45:01.243927 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e"} Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.073812 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.483101 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.585758 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.585993 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" containerID="cri-o://5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" gracePeriod=30 Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.586536 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" containerID="cri-o://77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" gracePeriod=30 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.264617 4839 generic.go:334] "Generic (PLEG): container finished" podID="6000d2d4-e84a-443f-9094-ab999541331d" containerID="89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac" exitCode=0 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.264915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" 
event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerDied","Data":"89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac"} Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.273852 4839 generic.go:334] "Generic (PLEG): container finished" podID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerID="5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" exitCode=143 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.273919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03"} Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.592277 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.592719 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zddh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6c266726-5bfd-4519-bdd5-9db7f6a77df4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.594145 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.729692 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.745070 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.757202 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.806034 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.806105 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807533 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: 
\"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807562 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.808544 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.808574 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809247 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809275 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809310 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809406 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809529 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809554 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 
04:45:03.819125 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs" (OuterVolumeSpecName: "logs") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.821962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822142 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs" (OuterVolumeSpecName: "logs") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m" (OuterVolumeSpecName: "kube-api-access-xmt9m") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "kube-api-access-xmt9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.843318 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h" (OuterVolumeSpecName: "kube-api-access-n782h") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "kube-api-access-n782h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.843872 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm" (OuterVolumeSpecName: "kube-api-access-sf4cm") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "kube-api-access-sf4cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.895157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914936 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914974 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914987 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914997 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915010 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915023 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915035 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915045 4839 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.959314 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle podName:e5f64e49-61a6-4601-b37b-f9af6079108c nodeName:}" failed. No retries permitted until 2026-03-21 04:45:04.45928051 +0000 UTC m=+1308.787067186 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c") : error deleting /var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volume-subpaths: remove /var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volume-subpaths: no such file or directory Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.959924 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.959958 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.960058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config" (OuterVolumeSpecName: "config") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.962793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data" (OuterVolumeSpecName: "config-data") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.964428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.969164 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.989823 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data" (OuterVolumeSpecName: "config-data") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017247 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017277 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017286 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017296 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017304 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017313 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017323 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.188255 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.230597 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d447b4d96-qkb69"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293343 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b"}
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293794 4839 scope.go:117] "RemoveContainer" containerID="715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.302194 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerStarted","Data":"06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0"}
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.307727 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.308071 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e"}
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.314913 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"5a91be0fc03b74a93a2e4a4a9276bbbc5ad15e2d8798017231a484ecfe7f2ff6"}
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.314972 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.315726 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" containerID="cri-o://0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" gracePeriod=30
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.315883 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" containerID="cri-o://97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" gracePeriod=30
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.340168 4839 scope.go:117] "RemoveContainer" containerID="4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.346967 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.355132 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.393871 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.402042 4839 scope.go:117] "RemoveContainer" containerID="fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.402867 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.441351 4839 scope.go:117] "RemoveContainer" containerID="ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.478661 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fb516-8745-40fb-b38d-53106c8310df" path="/var/lib/kubelet/pods/392fb516-8745-40fb-b38d-53106c8310df/volumes"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.479231 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" path="/var/lib/kubelet/pods/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15/volumes"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486570 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98964f649-mrjrt"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486803 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" containerID="cri-o://6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2" gracePeriod=30
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486944 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" containerID="cri-o://c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1" gracePeriod=30
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489172 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"]
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489505 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489518 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker"
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489531 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489536 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener"
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489562 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489570 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489581 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489587 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489614 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489621 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns"
Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489633 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="init"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489638 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="init"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490088 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490110 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490121 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490132 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490140 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.491659 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.494931 4839 scope.go:117] "RemoveContainer" containerID="d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.505855 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532424 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532957 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532978 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.533038 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlqg\" (UniqueName: \"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.537461 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.538395 4839 scope.go:117] "RemoveContainer" containerID="a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.602513 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:57558->10.217.0.159:9696: read: connection reset by peer"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.634991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635043 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635062 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635119 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlqg\" (UniqueName: \"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635253 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.641184 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.650200 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.654513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.656481 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlqg\" (UniqueName: \"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.671950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.675856 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.682078 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.689432 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.724435 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"]
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.820957 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748dbf85fc-jslwv"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.869953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms"
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939513 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939617 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939648 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939824 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939870 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") "
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.940254 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.944965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts" (OuterVolumeSpecName: "scripts") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.951611 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt" (OuterVolumeSpecName: "kube-api-access-g6mpt") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "kube-api-access-g6mpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.951619 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.969208 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.996689 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data" (OuterVolumeSpecName: "config-data") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041828 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041863 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041902 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041919 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041930 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041939 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.325707 4839 generic.go:334] "Generic (PLEG): container finished" podID="e965d008-890b-408c-a5a8-823aca00140a" containerID="c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1" exitCode=0
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.325794 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1"}
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerDied","Data":"4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688"}
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329113 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329162 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.332178 4839 generic.go:334] "Generic (PLEG): container finished" podID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerID="77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1" exitCode=0
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.332249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerDied","Data":"77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1"}
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.334407 4839 generic.go:334] "Generic (PLEG): container finished" podID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerID="97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" exitCode=2
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.334478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948"}
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.370953 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"]
Mar 21 04:45:05 crc kubenswrapper[4839]: W0321 04:45:05.375868 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd21ac8b_d3c0_4f0c_9205_d60d55425d8a.slice/crio-ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46 WatchSource:0}: Error finding container ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46: Status 404 returned error can't find the container with id ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.636538 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"]
Mar 21 04:45:05 crc kubenswrapper[4839]: E0321 04:45:05.636982 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.636996 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.637174 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.638169 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.684077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"]
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.731000 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.732871 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740083 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4bb6p"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740671 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740863 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758796 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758893 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.768655 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860461 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860566 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860710 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860737 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860802 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860893 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860926 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.861795 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.861906 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.862144 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.863661 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\"
(UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.864517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.887170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.949456 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.951273 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.954034 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.958716 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964760 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964786 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964817 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhfj\" 
(UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964958 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.968559 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.970513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.988210 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " 
pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.989131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.989805 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.003382 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067310 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: 
\"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067478 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067540 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067695 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.073838 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169039 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169425 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169690 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169771 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.170243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.170306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.176440 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.176843 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.177438 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.177655 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.200462 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.313916 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.358311 4839 generic.go:334] "Generic (PLEG): container finished" podID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerID="77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" exitCode=0 Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.358374 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.363489 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"3b862c544d0747d8c936d406da9f0474ff7e8a90021f0f455055aabe5b622b8f"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364259 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"a7ca8e601071db58f4f665717965ec3f7697dafded6b1d08bad598e9d13b62ad"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364499 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.411679 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-748dbf85fc-jslwv" podStartSLOduration=2.411657206 podStartE2EDuration="2.411657206s" 
podCreationTimestamp="2026-03-21 04:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:06.384198788 +0000 UTC m=+1310.711985464" watchObservedRunningTime="2026-03-21 04:45:06.411657206 +0000 UTC m=+1310.739443882" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.476331 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" path="/var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volumes" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.531547 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.684320 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.810228 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.897920 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.905743 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906416 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906465 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906569 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.907135 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs" (OuterVolumeSpecName: "logs") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.907539 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.913617 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.914060 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx" (OuterVolumeSpecName: "kube-api-access-pj9bx") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "kube-api-access-pj9bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.937522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.980219 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data" (OuterVolumeSpecName: "config-data") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008744 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008906 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " 
Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009350 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009364 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009373 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009384 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume" (OuterVolumeSpecName: "config-volume") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.012694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.012928 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2" (OuterVolumeSpecName: "kube-api-access-6rsc2") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "kube-api-access-6rsc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.095013 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:07 crc kubenswrapper[4839]: W0321 04:45:07.097013 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63efa50f_a0e7_4912_bbd8_c610daf572fd.slice/crio-6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407 WatchSource:0}: Error finding container 6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407: Status 404 returned error can't find the container with id 6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407 Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111178 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111210 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111219 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404249 4839 generic.go:334] "Generic (PLEG): container finished" podID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b" exitCode=0 Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404529 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404564 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerStarted","Data":"21228341591d8e5aec6ec7937412b30e803ee3d8f05a2c9720bd304ab86d36ca"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.409998 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"63601f472b5773f8a2494b64c1e43c04ccfd11d451bf08893682392398171dec"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418058 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerDied","Data":"06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418103 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418170 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.439892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.439939 4839 scope.go:117] "RemoveContainer" containerID="77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.440042 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.447795 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.508819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.533223 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.559551 4839 scope.go:117] "RemoveContainer" containerID="5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.441435 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.472396 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" path="/var/lib/kubelet/pods/10c8735c-f1c9-40f7-bd34-60bb0749bc23/volumes" Mar 21 
04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.473508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.476470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.478098 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerStarted","Data":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.478218 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.501240 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" podStartSLOduration=3.501214752 podStartE2EDuration="3.501214752s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:08.49682835 +0000 UTC m=+1312.824615046" watchObservedRunningTime="2026-03-21 04:45:08.501214752 +0000 UTC m=+1312.829001428" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.495797 4839 generic.go:334] "Generic (PLEG): container finished" podID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerID="0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" exitCode=0 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.495890 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505064 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505188 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" containerID="cri-o://bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" gracePeriod=30 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505281 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" containerID="cri-o://cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" gracePeriod=30 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505431 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.519285 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.525792 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.525778032 podStartE2EDuration="4.525778032s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:09.524075135 +0000 UTC m=+1313.851861831" watchObservedRunningTime="2026-03-21 04:45:09.525778032 +0000 UTC m=+1313.853564708" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.559260 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.962041362 podStartE2EDuration="4.559210228s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="2026-03-21 04:45:06.75640487 +0000 UTC m=+1311.084191546" lastFinishedPulling="2026-03-21 04:45:07.353573736 +0000 UTC m=+1311.681360412" observedRunningTime="2026-03-21 04:45:09.557611753 +0000 UTC m=+1313.885398429" watchObservedRunningTime="2026-03-21 04:45:09.559210228 +0000 UTC m=+1313.886996904" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.040771 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101913 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101952 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod 
\"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102071 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102119 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102238 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102249 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102559 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102590 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.112707 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts" (OuterVolumeSpecName: "scripts") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.156613 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.156662 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh" (OuterVolumeSpecName: "kube-api-access-5zddh") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "kube-api-access-5zddh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.165121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.168446 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data" (OuterVolumeSpecName: "config-data") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.205994 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206043 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206051 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206063 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.279547 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409373 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409489 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409631 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409688 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409744 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409831 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409925 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.410294 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.410370 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs" (OuterVolumeSpecName: "logs") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.414765 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm" (OuterVolumeSpecName: "kube-api-access-kmpmm") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "kube-api-access-kmpmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.415252 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts" (OuterVolumeSpecName: "scripts") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.415554 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.432991 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.472937 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data" (OuterVolumeSpecName: "config-data") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512072 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512122 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512136 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512146 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512156 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512170 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535773 4839 generic.go:334] "Generic (PLEG): container finished" podID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" exitCode=0 Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535808 4839 generic.go:334] "Generic (PLEG): container finished" podID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" exitCode=143 Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535849 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535879 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535912 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.536071 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.546874 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"41ed81fbf037f8ebe50fd1cd4bb84f9e7c73f61ee6cb668dca265d806ca14d96"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.546970 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.571519 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.611954 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.615623 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.631802 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.643255 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656463 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.656942 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656967 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.656984 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656992 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657014 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657022 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657041 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657048 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657071 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657080 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657097 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657104 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657114 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" 
containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657121 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657323 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657344 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657362 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657373 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657384 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657401 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657416 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658234 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.658744 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658777 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} err="failed to get container status \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": rpc error: code = NotFound desc = could not find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658802 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.659082 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659108 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} err="failed to get container status \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": 
container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659126 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659370 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} err="failed to get container status \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": rpc error: code = NotFound desc = could not find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659395 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659585 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659627 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} err="failed to get container status \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659646 4839 scope.go:117] "RemoveContainer" containerID="97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.696130 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726113 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726166 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726209 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726428 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.729505 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.749818 4839 scope.go:117] "RemoveContainer" 
containerID="0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.785318 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.815120 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.816992 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826502 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826905 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826937 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833209 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833331 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833390 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833546 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.834964 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.841126 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.841913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.850539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.852818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.854052 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.855036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.878338 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmhp2\" (UniqueName: 
\"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935635 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935664 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935739 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935853 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.936024 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.037829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: 
I0321 04:45:11.038177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038476 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038556 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038681 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: 
\"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038746 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.040035 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.042599 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.043790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.044087 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.044235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.048282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.055694 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.059468 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.075145 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.087916 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.276797 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.530789 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:45:11 crc kubenswrapper[4839]: W0321 04:45:11.530914 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b205eb_a84a_4c2f_8b49_068d4e0a8ec9.slice/crio-c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc WatchSource:0}: Error finding container c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc: Status 404 returned error can't find the container with id c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.531588 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.561542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc"} Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.699326 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:11 crc kubenswrapper[4839]: W0321 04:45:11.702209 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5162af3c_3b00_4643_afd9_680f6e2f5c03.slice/crio-9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936 WatchSource:0}: Error finding container 9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936: Status 404 returned error can't find the container with id 9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936 Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.720777 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.465501 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" path="/var/lib/kubelet/pods/63efa50f-a0e7-4912-bbd8-c610daf572fd/volumes" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.466476 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" path="/var/lib/kubelet/pods/6c266726-5bfd-4519-bdd5-9db7f6a77df4/volumes" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.582445 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"857a3c055c4bbb192eb5427ba4b0f790e0a6fefcec9392f76c0e8b227ed287ad"} Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.582492 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.451187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.578332 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.578968 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log" containerID="cri-o://0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" gracePeriod=30 Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.579475 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" containerID="cri-o://e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" gracePeriod=30 Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.585419 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.612411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"fdc9a5f2c9f812e6c489afb2deb5cfd846ff8ba1578516379dd6a810bb73aecc"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.613749 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.615963 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.615995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.647166 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.647147298 podStartE2EDuration="3.647147298s" podCreationTimestamp="2026-03-21 04:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:13.633326351 +0000 UTC m=+1317.961113027" watchObservedRunningTime="2026-03-21 04:45:13.647147298 +0000 UTC m=+1317.974933974" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.193319 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.218125 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.478407 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"] Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.482472 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.509412 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"] Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.636977 4839 generic.go:334] "Generic (PLEG): container finished" podID="e965d008-890b-408c-a5a8-823aca00140a" containerID="6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2" exitCode=0 Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.637478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2"} Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667807 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc 
kubenswrapper[4839]: I0321 04:45:15.667994 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668090 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668120 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668154 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.770824 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 
04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.771103 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.771168 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.772944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773187 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773301 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773561 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.777428 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.779178 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.780096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.790297 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.791127 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.798193 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.855008 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.917066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978542 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978607 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978632 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978772 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978893 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.987074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d" (OuterVolumeSpecName: "kube-api-access-wbt6d") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "kube-api-access-wbt6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.987697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.992757 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.071761 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.084775 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085063 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085100 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085110 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085123 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.125363 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config" (OuterVolumeSpecName: "config") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.137004 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.137366 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" containerID="cri-o://d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0" gracePeriod=10 Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.158931 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.191889 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.191923 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.233445 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.292612 4839 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.542105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.554655 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"] Mar 21 04:45:16 crc kubenswrapper[4839]: W0321 04:45:16.568867 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5a44f8_8eb1_4953_b611_a02576e414ea.slice/crio-8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b WatchSource:0}: Error finding container 8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b: Status 404 returned error can't find the container with id 8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.620906 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.742105 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367"} Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.751895 4839 generic.go:334] "Generic (PLEG): container finished" podID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerID="d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0" exitCode=0 Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.751994 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0"} Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf"} Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752085 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752823 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754504 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"965f07a77abecc2dcc57bce66cbf446672e1ba03feecba479f6dc24ff5964cee"} Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754622 4839 scope.go:117] "RemoveContainer" containerID="c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754780 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.759885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b"} Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.760005 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" containerID="cri-o://94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" gracePeriod=30 Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.760088 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" containerID="cri-o://29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" gracePeriod=30 Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.791141 4839 scope.go:117] "RemoveContainer" containerID="6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.817797 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.834628 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.843712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844116 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844198 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844331 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.853449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh" (OuterVolumeSpecName: "kube-api-access-xn7xh") pod 
"ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "kube-api-access-xn7xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.932201 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.933799 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.943008 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.946920 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947324 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947473 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947733 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.950058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.974064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config" (OuterVolumeSpecName: "config") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.983789 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40434->10.217.0.151:8443: read: connection reset by peer" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.053071 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.053110 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.771910 4839 generic.go:334] "Generic (PLEG): container finished" podID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerID="29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" exitCode=0 Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.772103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af"} Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.774234 4839 generic.go:334] "Generic (PLEG): container finished" podID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerID="e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" exitCode=0 Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.774286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" 
event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9"} Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.777396 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.783742 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"371fbcf0e25d611c8e7011aab1b093aa65787f92921f45d7a12f88bef0f0c11a"} Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784106 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"1582457b3b52da6361fec62b9a2f0adae91fef69d3f2274a400f4dcc5a17c915"} Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784307 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.837785 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75bd8b89b4-djjlh" podStartSLOduration=2.837766972 podStartE2EDuration="2.837766972s" podCreationTimestamp="2026-03-21 04:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:17.834374867 +0000 UTC m=+1322.162161543" watchObservedRunningTime="2026-03-21 04:45:17.837766972 +0000 UTC m=+1322.165553648" Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.881545 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.888804 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.466305 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" path="/var/lib/kubelet/pods/ac45c53b-2486-47d1-aaf4-23b76adfd431/volumes" Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.467480 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e965d008-890b-408c-a5a8-823aca00140a" path="/var/lib/kubelet/pods/e965d008-890b-408c-a5a8-823aca00140a/volumes" Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.613504 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.789196 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a"} Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.814448 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.815438702 podStartE2EDuration="8.814431035s" podCreationTimestamp="2026-03-21 04:45:10 +0000 UTC" firstStartedPulling="2026-03-21 04:45:11.53318052 +0000 UTC m=+1315.860967186" lastFinishedPulling="2026-03-21 04:45:17.532172843 +0000 UTC m=+1321.859959519" observedRunningTime="2026-03-21 04:45:18.812558582 +0000 UTC m=+1323.140345268" watchObservedRunningTime="2026-03-21 04:45:18.814431035 +0000 UTC m=+1323.142217711" Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.261937 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" 
podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827367 4839 generic.go:334] "Generic (PLEG): container finished" podID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerID="94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" exitCode=0 Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa"} Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827987 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.293610 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424594 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424776 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424809 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424842 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425782 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425894 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425894 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.426517 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.455992 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.457330 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj" (OuterVolumeSpecName: "kube-api-access-9nhfj") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "kube-api-access-9nhfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.459722 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts" (OuterVolumeSpecName: "scripts") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.499545 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529893 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529947 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529961 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.576764 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data" (OuterVolumeSpecName: "config-data") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.632746 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.851942 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"63601f472b5773f8a2494b64c1e43c04ccfd11d451bf08893682392398171dec"} Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.851957 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.852215 4839 scope.go:117] "RemoveContainer" containerID="29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.896627 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.899402 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.901966 4839 scope.go:117] "RemoveContainer" containerID="94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.922926 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923339 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923361 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923387 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923399 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923434 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="init" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923442 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="init" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923461 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923470 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923483 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923491 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923514 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923522 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923758 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923782 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923798 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923816 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e965d008-890b-408c-a5a8-823aca00140a" 
containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923826 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.924946 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.929777 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.938188 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040185 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040237 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040270 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040402 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040599 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142053 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142102 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142172 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142223 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.148980 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " 
pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.149176 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.149362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.151652 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.165167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.258488 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.506836 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: i/o timeout" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.815236 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.839243 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.840543 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.843241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.843306 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4vnsh" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.846846 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.866252 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.885337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"1618cf281ea19a9f57a3e69a69a8dc2918fb433aa2cfe42c7bca56c00689cd62"} Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959793 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.061868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.061928 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: 
\"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.062018 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.062073 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.063114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.074903 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.085283 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " 
pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.095171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.125497 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.125795 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.152932 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.211043 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.214461 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.223214 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268873 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtpr9\" (UniqueName: \"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268917 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268950 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.269122 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: E0321 04:45:22.317243 4839 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 21 04:45:22 crc kubenswrapper[4839]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_openstackclient_openstack_829e2047-17e6-49ec-9baf-1339c0f5aea6_0(cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac" Netns:"/var/run/netns/a9a1b473-5023-4516-8b95-be260f3e2bc9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac;K8S_POD_UID=829e2047-17e6-49ec-9baf-1339c0f5aea6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/829e2047-17e6-49ec-9baf-1339c0f5aea6]: expected pod UID "829e2047-17e6-49ec-9baf-1339c0f5aea6" but got "52b9f7e1-d86c-457e-9391-eee855a9f7a7" from Kube API Mar 21 04:45:22 crc kubenswrapper[4839]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 04:45:22 crc kubenswrapper[4839]: > Mar 21 04:45:22 crc kubenswrapper[4839]: E0321 04:45:22.317316 4839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 21 04:45:22 crc kubenswrapper[4839]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_829e2047-17e6-49ec-9baf-1339c0f5aea6_0(cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac" Netns:"/var/run/netns/a9a1b473-5023-4516-8b95-be260f3e2bc9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac;K8S_POD_UID=829e2047-17e6-49ec-9baf-1339c0f5aea6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/829e2047-17e6-49ec-9baf-1339c0f5aea6]: expected pod UID "829e2047-17e6-49ec-9baf-1339c0f5aea6" but got "52b9f7e1-d86c-457e-9391-eee855a9f7a7" from Kube API Mar 21 04:45:22 crc kubenswrapper[4839]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 04:45:22 crc kubenswrapper[4839]: > pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.371970 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtpr9\" (UniqueName: \"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372041 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372092 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372185 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.373606 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.375974 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.376042 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.392556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtpr9\" (UniqueName: 
\"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.469818 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" path="/var/lib/kubelet/pods/82b135c8-5fc8-4930-9577-1dd9181a1dae/volumes" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.535430 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.916851 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.918493 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"0c7f33b3e0cfcb04e6e2b5015ac76fb269b31f642a83f153e7c4a18eadb3d006"} Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.952939 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.956113 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" podUID="52b9f7e1-d86c-457e-9391-eee855a9f7a7" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.026616 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084141 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084510 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084660 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 
04:45:23.085298 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.092707 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc" (OuterVolumeSpecName: "kube-api-access-7tlqc") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "kube-api-access-7tlqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.092881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.093860 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186484 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186521 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186533 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186547 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.686349 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.932687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"52b9f7e1-d86c-457e-9391-eee855a9f7a7","Type":"ContainerStarted","Data":"c327f228ec049650eba893fbc6b85e088c562445e9ab6b437e18033dbe633ec6"} Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.938517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.940991 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"bfa5e215a8064142662175001ba32c6559c68245a861ff8dad6895a48d51f16f"} Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.966064 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" podUID="52b9f7e1-d86c-457e-9391-eee855a9f7a7" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.969497 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.969477808 podStartE2EDuration="3.969477808s" podCreationTimestamp="2026-03-21 04:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:23.963510701 +0000 UTC m=+1328.291297377" watchObservedRunningTime="2026-03-21 04:45:23.969477808 +0000 UTC m=+1328.297264484" Mar 21 04:45:24 crc kubenswrapper[4839]: I0321 04:45:24.477157 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" path="/var/lib/kubelet/pods/829e2047-17e6-49ec-9baf-1339c0f5aea6/volumes" Mar 21 04:45:26 crc kubenswrapper[4839]: I0321 04:45:26.259831 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.467345 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.469896 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476167 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476310 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476357 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.510293 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565419 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565463 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565512 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565562 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565607 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565630 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc 
kubenswrapper[4839]: I0321 04:45:27.667901 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.667968 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668186 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668212 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668239 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668272 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668331 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668892 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668973 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.674893 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.675217 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.676583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.680300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.681833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.685435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod 
\"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.792201 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:29 crc kubenswrapper[4839]: I0321 04:45:29.262436 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347099 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347456 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" containerID="cri-o://c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347554 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" containerID="cri-o://4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347615 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" containerID="cri-o://535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347856 4839 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" containerID="cri-o://c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.362647 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.177:3000/\": EOF" Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.981045 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.981128 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011206 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011243 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" exitCode=2 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011253 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" 
containerID="535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011262 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011331 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011345 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.488939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.746646 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810035 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810099 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810143 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810241 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810336 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810419 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.811037 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.811214 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.816503 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts" (OuterVolumeSpecName: "scripts") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.816630 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2" (OuterVolumeSpecName: "kube-api-access-vmhp2") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "kube-api-access-vmhp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.837049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.885682 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.902919 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data" (OuterVolumeSpecName: "config-data") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913082 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913114 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913124 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913137 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913146 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913156 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913163 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.028800 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"52b9f7e1-d86c-457e-9391-eee855a9f7a7","Type":"ContainerStarted","Data":"a5dd87302be70fb0a0dce91612650d2cd8feb29133bfdeebdfad1374e6c9d593"} Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035521 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc"} Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035601 4839 scope.go:117] "RemoveContainer" containerID="4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035600 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.054469 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.608250314 podStartE2EDuration="11.053934919s" podCreationTimestamp="2026-03-21 04:45:22 +0000 UTC" firstStartedPulling="2026-03-21 04:45:23.03368175 +0000 UTC m=+1327.361468426" lastFinishedPulling="2026-03-21 04:45:32.479366365 +0000 UTC m=+1336.807153031" observedRunningTime="2026-03-21 04:45:33.047716245 +0000 UTC m=+1337.375502921" watchObservedRunningTime="2026-03-21 04:45:33.053934919 +0000 UTC m=+1337.381721595" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.067933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.073476 4839 scope.go:117] "RemoveContainer" containerID="c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.085736 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: 
I0321 04:45:33.101558 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.104905 4839 scope.go:117] "RemoveContainer" containerID="535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.113779 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114411 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114519 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114558 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114582 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114606 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114614 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114626 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114634 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114970 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115236 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115257 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115279 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.117519 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.120201 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.120476 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.132121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.144766 4839 scope.go:117] "RemoveContainer" containerID="c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218784 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218835 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218849 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218899 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218935 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218997 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320201 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " 
pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320237 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320259 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320918 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320995 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324588 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324980 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.341035 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.462193 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.909938 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: W0321 04:45:33.915785 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78141ffe_ee2c_4b66_b8e9_7224e526dfa2.slice/crio-826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a WatchSource:0}: Error finding container 826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a: Status 404 returned error can't find the container with id 826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.047156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"ee99a1a569d8b7a699f9dad6be9bed8c1d6fcb40d8143ea287a9d977a097eee4"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" 
event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"e9bcd08231d23ab63a4055b1e20413d6c3cbe21af71de74483e4551996a1b55f"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"b6bd6f4fe36bb8f41cb753bbbf61d47676d982d86187d83427cd05de9a878678"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.465036 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" path="/var/lib/kubelet/pods/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9/volumes" Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.833604 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.954916 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.956777 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d447b4d96-qkb69" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" containerID="cri-o://67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" gracePeriod=30 Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.956970 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d447b4d96-qkb69" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" containerID="cri-o://b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" gracePeriod=30 Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 04:45:35.089699 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 
04:45:35.089746 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 04:45:35.150417 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b66c6bfff-76gfx" podStartSLOduration=8.150393858 podStartE2EDuration="8.150393858s" podCreationTimestamp="2026-03-21 04:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:35.124989117 +0000 UTC m=+1339.452775813" watchObservedRunningTime="2026-03-21 04:45:35.150393858 +0000 UTC m=+1339.478180534" Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.099592 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76"} Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.101780 4839 generic.go:334] "Generic (PLEG): container finished" podID="12b60d89-b044-4822-bc95-47567123e883" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" exitCode=0 Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.101840 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.393410 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:37 crc kubenswrapper[4839]: I0321 04:45:37.113782 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a"} Mar 21 04:45:37 crc kubenswrapper[4839]: I0321 04:45:37.114096 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f"} Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.134885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a"} Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135155 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" containerID="cri-o://b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135391 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" containerID="cri-o://b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135467 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" containerID="cri-o://7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135424 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" containerID="cri-o://0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135620 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.169318 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.382260259 podStartE2EDuration="6.169297628s" podCreationTimestamp="2026-03-21 04:45:33 +0000 UTC" firstStartedPulling="2026-03-21 04:45:33.920056298 +0000 UTC m=+1338.247842974" lastFinishedPulling="2026-03-21 04:45:38.707093667 +0000 UTC m=+1343.034880343" observedRunningTime="2026-03-21 04:45:39.159933186 +0000 UTC m=+1343.487719862" watchObservedRunningTime="2026-03-21 04:45:39.169297628 +0000 UTC m=+1343.497084304" Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.261843 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.065092 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146636 4839 generic.go:334] "Generic (PLEG): container finished" podID="12b60d89-b044-4822-bc95-47567123e883" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146689 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146748 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"e44c81ce53fb7cb7cb67615e87aced0fc7bd4c886cbd53ea268fc23a5209a592"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146769 4839 scope.go:117] "RemoveContainer" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151066 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151099 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" exitCode=2 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151108 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151149 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151158 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.166791 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.166867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167018 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167251 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.168914 4839 scope.go:117] "RemoveContainer" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.173801 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.174322 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79" (OuterVolumeSpecName: "kube-api-access-swz79") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "kube-api-access-swz79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.227589 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config" (OuterVolumeSpecName: "config") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.244244 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.257859 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269753 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269782 4839 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269796 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269805 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269814 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311283 4839 scope.go:117] "RemoveContainer" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: E0321 04:45:40.311880 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": container with ID starting with b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc not found: ID does not exist" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311929 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} err="failed to get container status \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": rpc error: code = NotFound desc = could not find container \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": container with ID starting with b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc not found: ID does not exist" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311995 4839 scope.go:117] "RemoveContainer" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: E0321 04:45:40.312430 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": container with ID starting with 
67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90 not found: ID does not exist" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.312462 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} err="failed to get container status \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": rpc error: code = NotFound desc = could not find container \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": container with ID starting with 67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90 not found: ID does not exist" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.495415 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.506329 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.162737 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" exitCode=0 Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.162810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76"} Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.243184 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323703 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323748 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323788 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323805 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323911 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.324390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.324623 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.332898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5" (OuterVolumeSpecName: "kube-api-access-ksth5") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "kube-api-access-ksth5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.334796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts" (OuterVolumeSpecName: "scripts") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.359757 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426284 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426685 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426700 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426711 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 
04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426723 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432038 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432439 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432457 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432481 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432490 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432501 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432547 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432583 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432591 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" 
containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432606 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432615 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432631 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432638 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432791 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432800 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432816 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432829 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432843 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432853 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.433370 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.449371 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.469830 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data" (OuterVolumeSpecName: "config-data") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.470944 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.503914 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.504959 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.522768 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.527940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.528250 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.529160 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.529667 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.617797 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.619290 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.630597 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.631828 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634374 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634465 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634515 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: 
\"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.635183 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.636328 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.645994 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.663099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.674472 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736646 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736727 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: 
\"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736833 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736910 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.737779 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.750937 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.759641 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.827146 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.829356 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.830705 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.833379 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838405 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838472 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838612 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.839351 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.842583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.857674 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.864735 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.865171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.933840 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.940510 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.940802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.945721 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.041622 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.042807 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.042855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.043062 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.045085 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.046105 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.058667 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.064906 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.144988 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.145293 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j98k2\" (UniqueName: 
\"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a"} Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186400 4839 scope.go:117] "RemoveContainer" containerID="b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186593 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.232959 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.241007 4839 scope.go:117] "RemoveContainer" containerID="7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.246892 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.246963 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: 
\"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.250930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.252103 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76e9253_1495_42d5_910f_cce6f2730243.slice/crio-54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb WatchSource:0}: Error finding container 54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb: Status 404 returned error can't find the container with id 54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.261992 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.266353 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.283850 4839 scope.go:117] "RemoveContainer" containerID="0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.284034 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:42 crc 
kubenswrapper[4839]: I0321 04:45:42.295828 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.298758 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.300963 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.301226 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.307138 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.307753 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.326554 4839 scope.go:117] "RemoveContainer" containerID="b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348543 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348631 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348666 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348698 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.349014 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.349169 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.382817 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451430 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451736 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451951 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452065 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452690 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452788 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.453232 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.458593 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.460726 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.461236 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.463629 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.474061 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b60d89-b044-4822-bc95-47567123e883" path="/var/lib/kubelet/pods/12b60d89-b044-4822-bc95-47567123e883/volumes" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.474887 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" path="/var/lib/kubelet/pods/78141ffe-ee2c-4b66-b8e9-7224e526dfa2/volumes" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.477783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.487724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.510458 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.523315 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.554912 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c56098_2959_4bd0_b762_36a4ee1bb2e6.slice/crio-4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8 WatchSource:0}: Error finding container 4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8: Status 404 returned error can't find the container with id 4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8 Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.628254 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.803764 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.826477 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.923754 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.927658 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60534a44_1538_4bdb_81d1_043c9ae84cee.slice/crio-df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471 WatchSource:0}: Error finding container df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471: Status 404 returned error can't find the container with id df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471 Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.000930 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.228832 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerStarted","Data":"ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.228884 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerStarted","Data":"4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.234131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerStarted","Data":"d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.234364 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerStarted","Data":"54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.238467 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.244500 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerStarted","Data":"df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.255220 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" 
event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerStarted","Data":"5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.261765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerStarted","Data":"9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.261919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerStarted","Data":"f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.279005 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4zz89" podStartSLOduration=2.278970437 podStartE2EDuration="2.278970437s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.254316157 +0000 UTC m=+1347.582102843" watchObservedRunningTime="2026-03-21 04:45:43.278970437 +0000 UTC m=+1347.606757113" Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.300869 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-48d5-account-create-update-5k79b" podStartSLOduration=2.300843349 podStartE2EDuration="2.300843349s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.295057327 +0000 UTC m=+1347.622844003" watchObservedRunningTime="2026-03-21 04:45:43.300843349 +0000 UTC m=+1347.628630025" Mar 21 04:45:43 crc kubenswrapper[4839]: 
I0321 04:45:43.300988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerStarted","Data":"296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.301049 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerStarted","Data":"69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.319734 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-ds7tq" podStartSLOduration=2.319717137 podStartE2EDuration="2.319717137s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.314205533 +0000 UTC m=+1347.641992209" watchObservedRunningTime="2026-03-21 04:45:43.319717137 +0000 UTC m=+1347.647503813" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.329174 4839 generic.go:334] "Generic (PLEG): container finished" podID="b76e9253-1495-42d5-910f-cce6f2730243" containerID="d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.329486 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerDied","Data":"d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.333868 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"41ff38380ac8ed55675761ad2bd4b24ee85da709e085d038d76ac53207f2c9ae"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.353047 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerStarted","Data":"2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.360078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerStarted","Data":"c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.368389 4839 generic.go:334] "Generic (PLEG): container finished" podID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerID="9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.368783 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerDied","Data":"9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.371839 4839 generic.go:334] "Generic (PLEG): container finished" podID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerID="296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.371890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerDied","Data":"296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692"} Mar 21 04:45:44 crc kubenswrapper[4839]: 
I0321 04:45:44.377209 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" podStartSLOduration=3.37719068 podStartE2EDuration="3.37719068s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:44.371446669 +0000 UTC m=+1348.699233345" watchObservedRunningTime="2026-03-21 04:45:44.37719068 +0000 UTC m=+1348.704977356" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.385607 4839 generic.go:334] "Generic (PLEG): container finished" podID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerID="ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.385706 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerDied","Data":"ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.393207 4839 generic.go:334] "Generic (PLEG): container finished" podID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerID="0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" exitCode=137 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.393301 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.411837 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" podStartSLOduration=3.411817518 podStartE2EDuration="3.411817518s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:44.386998814 +0000 UTC m=+1348.714785510" watchObservedRunningTime="2026-03-21 04:45:44.411817518 +0000 UTC m=+1348.739604194" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.513688 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.678795 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.715712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716605 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716800 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: 
\"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716839 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716976 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.717026 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.722234 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp" (OuterVolumeSpecName: "kube-api-access-vmrcp") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "kube-api-access-vmrcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.722485 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs" (OuterVolumeSpecName: "logs") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.725254 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.748057 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data" (OuterVolumeSpecName: "config-data") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.749198 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts" (OuterVolumeSpecName: "scripts") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.749941 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.790128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821283 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821345 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821361 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821374 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821386 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821398 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821412 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.412152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"b5753b189f3fee68b09fb93ec56788b978b6a6741d48ecf04c45ca76fee101e1"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.412176 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.413383 4839 scope.go:117] "RemoveContainer" containerID="e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.418871 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.418911 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.421354 4839 generic.go:334] "Generic (PLEG): container finished" podID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerID="2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1" exitCode=0 Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.421466 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerDied","Data":"2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.423119 4839 generic.go:334] "Generic (PLEG): container finished" podID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerID="c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00" exitCode=0 Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.423285 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerDied","Data":"c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.483995 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.504956 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.632752 4839 scope.go:117] "RemoveContainer" containerID="0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.970959 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.047339 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"f481fb0d-ac2f-4989-a547-50f5081e4e78\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.047439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"f481fb0d-ac2f-4989-a547-50f5081e4e78\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.052823 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b" (OuterVolumeSpecName: "kube-api-access-gpg9b") pod "f481fb0d-ac2f-4989-a547-50f5081e4e78" (UID: "f481fb0d-ac2f-4989-a547-50f5081e4e78"). InnerVolumeSpecName "kube-api-access-gpg9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.069179 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f481fb0d-ac2f-4989-a547-50f5081e4e78" (UID: "f481fb0d-ac2f-4989-a547-50f5081e4e78"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.149033 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.149066 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.185513 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.206347 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.210763 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250177 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250328 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250422 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"b76e9253-1495-42d5-910f-cce6f2730243\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250486 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250515 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"b76e9253-1495-42d5-910f-cce6f2730243\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250642 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9220ed3c-2e97-4efc-a4cc-28bb29774ad8" (UID: "9220ed3c-2e97-4efc-a4cc-28bb29774ad8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46c56098-2959-4bd0-b762-36a4ee1bb2e6" (UID: "46c56098-2959-4bd0-b762-36a4ee1bb2e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251417 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b76e9253-1495-42d5-910f-cce6f2730243" (UID: "b76e9253-1495-42d5-910f-cce6f2730243"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251439 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.257401 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx" (OuterVolumeSpecName: "kube-api-access-bhvhx") pod "46c56098-2959-4bd0-b762-36a4ee1bb2e6" (UID: "46c56098-2959-4bd0-b762-36a4ee1bb2e6"). InnerVolumeSpecName "kube-api-access-bhvhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.273822 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw" (OuterVolumeSpecName: "kube-api-access-mb2gw") pod "9220ed3c-2e97-4efc-a4cc-28bb29774ad8" (UID: "9220ed3c-2e97-4efc-a4cc-28bb29774ad8"). InnerVolumeSpecName "kube-api-access-mb2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.273939 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj" (OuterVolumeSpecName: "kube-api-access-978bj") pod "b76e9253-1495-42d5-910f-cce6f2730243" (UID: "b76e9253-1495-42d5-910f-cce6f2730243"). InnerVolumeSpecName "kube-api-access-978bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353439 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353487 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353501 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353519 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerDied","Data":"69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433864 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433863 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerDied","Data":"4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435451 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435408 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438332 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerDied","Data":"54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438358 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438371 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.440560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442144 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerDied","Data":"f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442187 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442210 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.477770 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" path="/var/lib/kubelet/pods/b3b26c3a-55d5-442a-9c31-187b0aa60f90/volumes" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.048166 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.055677 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067466 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"4185a56e-9d10-4aea-ad84-a865dff3e6be\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067536 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"4185a56e-9d10-4aea-ad84-a865dff3e6be\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"60534a44-1538-4bdb-81d1-043c9ae84cee\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"60534a44-1538-4bdb-81d1-043c9ae84cee\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.068277 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4185a56e-9d10-4aea-ad84-a865dff3e6be" (UID: "4185a56e-9d10-4aea-ad84-a865dff3e6be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.069074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60534a44-1538-4bdb-81d1-043c9ae84cee" (UID: "60534a44-1538-4bdb-81d1-043c9ae84cee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.107034 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf" (OuterVolumeSpecName: "kube-api-access-vb7zf") pod "4185a56e-9d10-4aea-ad84-a865dff3e6be" (UID: "4185a56e-9d10-4aea-ad84-a865dff3e6be"). InnerVolumeSpecName "kube-api-access-vb7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.107201 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2" (OuterVolumeSpecName: "kube-api-access-j98k2") pod "60534a44-1538-4bdb-81d1-043c9ae84cee" (UID: "60534a44-1538-4bdb-81d1-043c9ae84cee"). InnerVolumeSpecName "kube-api-access-j98k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170225 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170256 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170268 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170279 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.286131 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.376732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.481220 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.481415 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerDied","Data":"df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471"} Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482490 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482516 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482769 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5788c8f798-khqlb" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log" containerID="cri-o://50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" gracePeriod=30 Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.483178 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5788c8f798-khqlb" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api" containerID="cri-o://ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" gracePeriod=30 Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498147 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498324 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerDied","Data":"5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0"} Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498439 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0" Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.396918 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.397722 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log" containerID="cri-o://7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.397851 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd" containerID="cri-o://f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514275 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent" containerID="cri-o://815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514629 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514969 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd" containerID="cri-o://5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.515031 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core" containerID="cri-o://5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.515078 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent" containerID="cri-o://fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.520337 4839 generic.go:334] "Generic (PLEG): container finished" podID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" exitCode=143 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.520383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} Mar 21 04:45:48 crc 
kubenswrapper[4839]: I0321 04:45:48.555288 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9983906820000001 podStartE2EDuration="6.555268491s" podCreationTimestamp="2026-03-21 04:45:42 +0000 UTC" firstStartedPulling="2026-03-21 04:45:43.228986669 +0000 UTC m=+1347.556773345" lastFinishedPulling="2026-03-21 04:45:47.785864478 +0000 UTC m=+1352.113651154" observedRunningTime="2026-03-21 04:45:48.54985561 +0000 UTC m=+1352.877642286" watchObservedRunningTime="2026-03-21 04:45:48.555268491 +0000 UTC m=+1352.883055167" Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.535706 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" exitCode=0 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536025 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" exitCode=2 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536060 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" exitCode=0 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536139 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} Mar 21 04:45:49 crc kubenswrapper[4839]: 
I0321 04:45:49.536152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.538844 4839 generic.go:334] "Generic (PLEG): container finished" podID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" exitCode=143 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.538876 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"} Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103418 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log" containerID="cri-o://688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35" gracePeriod=30 Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103816 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd" containerID="cri-o://9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" gracePeriod=30 Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.559784 4839 generic.go:334] "Generic (PLEG): container finished" podID="506e1e04-5787-48bb-9165-96a55f0d3095" containerID="688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35" exitCode=143 Mar 21 04:45:50 crc 
kubenswrapper[4839]: I0321 04:45:50.560264 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35"} Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.064926 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257750 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257824 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257854 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257966 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258085 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258492 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs" (OuterVolumeSpecName: "logs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.263463 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts" (OuterVolumeSpecName: "scripts") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.268697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg" (OuterVolumeSpecName: "kube-api-access-4nhpg") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "kube-api-access-4nhpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.338299 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.338820 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data" (OuterVolumeSpecName: "config-data") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363048 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363085 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363100 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363114 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363126 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.382085 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.393737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.465384 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.465430 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569554 4839 generic.go:334] "Generic (PLEG): container finished" podID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" exitCode=0
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569610 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"}
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569636 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"3897ef34a5a560221b0da70d53a0118dcc2423f236d8ea84230926286a71f6ee"}
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569640 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5788c8f798-khqlb"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569651 4839 scope.go:117] "RemoveContainer" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.629234 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5788c8f798-khqlb"]
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.636763 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5788c8f798-khqlb"]
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.652123 4839 scope.go:117] "RemoveContainer" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.678265 4839 scope.go:117] "RemoveContainer" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"
Mar 21 04:45:51 crc kubenswrapper[4839]: E0321 04:45:51.678929 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": container with ID starting with ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5 not found: ID does not exist" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.678997 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"} err="failed to get container status \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": rpc error: code = NotFound desc = could not find container \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": container with ID starting with ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5 not found: ID does not exist"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.679030 4839 scope.go:117] "RemoveContainer" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"
Mar 21 04:45:51 crc kubenswrapper[4839]: E0321 04:45:51.682415 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": container with ID starting with 50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345 not found: ID does not exist" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.682474 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} err="failed to get container status \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": rpc error: code = NotFound desc = could not find container \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": container with ID starting with 50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345 not found: ID does not exist"
Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.991226 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.077765 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078096 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078191 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078227 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078307 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078346 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078378 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078405 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs" (OuterVolumeSpecName: "logs") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078854 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.079080 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.094456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.094881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m" (OuterVolumeSpecName: "kube-api-access-kgs9m") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "kube-api-access-kgs9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.096734 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts" (OuterVolumeSpecName: "scripts") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.109385 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179543 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179746 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179759 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179768 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179798 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.214660 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"]
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215384 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215412 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215435 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215442 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215450 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215460 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215477 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215499 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215515 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215520 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215535 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215541 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215557 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215582 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215599 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215620 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215626 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215640 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215651 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215664 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215866 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215878 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215894 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215905 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215912 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215929 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215949 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215957 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215964 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215973 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215985 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.216880 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.220732 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.221072 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.221222 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t66x4"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.226593 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.231808 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.240824 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"]
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.242633 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data" (OuterVolumeSpecName: "config-data") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.283981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284046 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284194 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284255 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284270 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284361 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385707 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.391693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.392186 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.392708 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.416473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.463712 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" path="/var/lib/kubelet/pods/30c2fe46-cd8a-43f9-8968-b6e65d7c862a/volumes"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.548266 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581748 4839 generic.go:334] "Generic (PLEG): container finished" podID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" exitCode=0
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"}
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581839 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"d1e4b5b263d8711e41038cc9c72c0cf72e4c984c445036bd50e6733715123ea1"}
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581860 4839 scope.go:117] "RemoveContainer" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.582005 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.690186 4839 scope.go:117] "RemoveContainer" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.697337 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.705059 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.718723 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.720525 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.723700 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.724360 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.728524 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.744685 4839 scope.go:117] "RemoveContainer" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.745483 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": container with ID starting with f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34 not found: ID does not exist" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.745521 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"} err="failed to get container status \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": rpc error: code = NotFound desc = could not find container \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": container with ID starting with f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34 not found: ID does not exist"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.745545 4839 scope.go:117] "RemoveContainer" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"
Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.746082 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": container with ID starting with 7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc not found: ID does not exist" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.746161 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"} err="failed to get container status \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": rpc error: code = NotFound desc = could not find container \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": container with ID starting with 7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc not found: ID does not exist"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895499 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895555 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895689 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895735 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895774 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895841 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.996917 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.996993 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0"
Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.996993
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997054 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997101 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997170 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997541 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.998272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.998273 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.002781 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.003939 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.007467 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.008282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.015825 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.027730 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.038837 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.061689 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.592885 4839 generic.go:334] "Generic (PLEG): container finished" podID="506e1e04-5787-48bb-9165-96a55f0d3095" containerID="9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" exitCode=0 Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.592939 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169"} Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.595540 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerStarted","Data":"167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c"} Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.618770 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.858817 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019409 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019477 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019501 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019550 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019682 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019707 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.020980 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.021081 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs" (OuterVolumeSpecName: "logs") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.031026 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr" (OuterVolumeSpecName: "kube-api-access-tfhfr") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "kube-api-access-tfhfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.031106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts" (OuterVolumeSpecName: "scripts") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.035456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.061896 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.088390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data" (OuterVolumeSpecName: "config-data") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.094001 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.095677 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121416 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121875 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121905 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc 
kubenswrapper[4839]: I0321 04:45:54.121955 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121985 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.122033 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.122065 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123558 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123684 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123697 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123717 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123727 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123735 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123743 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123751 4839 reconciler_common.go:293] "Volume 
detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123758 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.126870 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.128784 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn" (OuterVolumeSpecName: "kube-api-access-ldtbn") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "kube-api-access-ldtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.129386 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts" (OuterVolumeSpecName: "scripts") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.168191 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.173547 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226043 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226369 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226382 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226393 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226403 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226488 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.248733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data" (OuterVolumeSpecName: "config-data") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.329767 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.329806 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.473143 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" path="/var/lib/kubelet/pods/524772c8-3fdb-43dc-8532-1d8e9dcdeb97/volumes" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"b3b20c1c58919d92014e2b8b23b7b38a20303479312dd8c6c51224fcd3f18728"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618543 4839 scope.go:117] "RemoveContainer" containerID="9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618681 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.625437 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"5a8717c8e282ed1a9dd0c8621b1e357e6005c3b9225ddebfca3234dc9e7a1b1c"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.625810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"edef24dd73386b464b06e73affcab4a9872357de542f95bf3af4a775e757c502"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632767 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" exitCode=0 Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632809 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"41ff38380ac8ed55675761ad2bd4b24ee85da709e085d038d76ac53207f2c9ae"} 
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632890 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.655709 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.665737 4839 scope.go:117] "RemoveContainer" containerID="688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.674075 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.705668 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.725382 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.758132 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.759708 4839 scope.go:117] "RemoveContainer" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760882 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760899 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760906 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760927 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760937 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760943 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760967 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760975 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.761001 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761008 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761293 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761310 4839 
memory_manager.go:354] "RemoveStaleState removing state" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761321 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761330 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761338 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761355 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.762481 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.766687 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.766870 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.802107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.808803 4839 scope.go:117] "RemoveContainer" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.824983 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.827979 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.832539 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.832707 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850423 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850444 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850479 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850505 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850560 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.857819 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.886435 4839 scope.go:117] "RemoveContainer" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.924989 4839 scope.go:117] "RemoveContainer" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.950641 4839 scope.go:117] "RemoveContainer" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.951400 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": container with ID starting with 
5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717 not found: ID does not exist" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.951456 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} err="failed to get container status \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": rpc error: code = NotFound desc = could not find container \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": container with ID starting with 5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717 not found: ID does not exist" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.951489 4839 scope.go:117] "RemoveContainer" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.951914 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": container with ID starting with 5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be not found: ID does not exist" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.952107 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} err="failed to get container status \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": rpc error: code = NotFound desc = could not find container \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": container with ID starting with 5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be not found: ID does not 
exist" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953135 4839 scope.go:117] "RemoveContainer" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.952002 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953456 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953581 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " 
pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953659 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953716 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953920 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953962 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953985 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954087 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954180 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954471 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.959145 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.960588 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.962716 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.963807 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": container with ID starting with 
fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712 not found: ID does not exist" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.963864 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} err="failed to get container status \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": rpc error: code = NotFound desc = could not find container \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": container with ID starting with fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712 not found: ID does not exist" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.963897 4839 scope.go:117] "RemoveContainer" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.965061 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": container with ID starting with 815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e not found: ID does not exist" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.965099 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} err="failed to get container status \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": rpc error: code = NotFound desc = could not find container \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": container with ID starting with 815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e not found: ID does not 
exist" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.967182 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.974548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057045 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057333 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057920 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.058005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056980 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.060222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.060454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.064429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.070649 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.071497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" 
(UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.076721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.081585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.105001 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.163924 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.394544 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.651354 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"9e1322d5a22397b4c22be78ee22e362bc9ad41eba5ef21736d759c4b8060bf8e"} Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.677363 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.6773410220000002 podStartE2EDuration="3.677341022s" podCreationTimestamp="2026-03-21 04:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:55.671996112 +0000 UTC m=+1359.999782798" watchObservedRunningTime="2026-03-21 04:45:55.677341022 +0000 UTC m=+1360.005127698" Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.742513 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:55 crc kubenswrapper[4839]: W0321 04:45:55.753953 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d47edf7_95e3_4bb5_ab87_27c9db5d05d6.slice/crio-d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483 WatchSource:0}: Error finding container d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483: Status 404 returned error can't find the container with id d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483 Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.926732 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.464417 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" 
path="/var/lib/kubelet/pods/506e1e04-5787-48bb-9165-96a55f0d3095/volumes" Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.465503 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" path="/var/lib/kubelet/pods/df08a3cb-a9ae-4b8e-a9c8-604c41db5158/volumes" Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.680487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"e89b3f506e5a960d98bb0c12eb2af752b39195c18d53d7ed12721c0cbc928e86"} Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.680537 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"cc947298e9a397670be68e795db4fd3f3eed3e7514187eed9facc497f7066d52"} Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.686872 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03"} Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.686955 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483"} Mar 21 04:45:57 crc kubenswrapper[4839]: I0321 04:45:57.698375 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"29c2b41c34c58734758f0c4fa9ee85718e44617f6aedee7277175ff1043e3cd3"} Mar 21 04:45:57 crc kubenswrapper[4839]: I0321 04:45:57.732386 4839 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.732369502 podStartE2EDuration="3.732369502s" podCreationTimestamp="2026-03-21 04:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:57.718972445 +0000 UTC m=+1362.046759121" watchObservedRunningTime="2026-03-21 04:45:57.732369502 +0000 UTC m=+1362.060156178"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.146365 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.148127 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.150165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.150889 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.151381 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.158676 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.267155 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.370063 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.389952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.471761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980041 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980261 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980320 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.981213 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.981271 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f" gracePeriod=600
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.681965 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:01 crc kubenswrapper[4839]: W0321 04:46:01.682584 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75c1454e_0aed_48d9_a0f2_f7c2797156ce.slice/crio-bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e WatchSource:0}: Error finding container bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e: Status 404 returned error can't find the container with id bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.735622 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.738667 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerStarted","Data":"a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741413 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f" exitCode=0
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741476 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741807 4839 scope.go:117] "RemoveContainer" containerID="3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.743413 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerStarted","Data":"bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.763909 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" podStartSLOduration=1.453500151 podStartE2EDuration="9.763888061s" podCreationTimestamp="2026-03-21 04:45:52 +0000 UTC" firstStartedPulling="2026-03-21 04:45:53.036410662 +0000 UTC m=+1357.364197338" lastFinishedPulling="2026-03-21 04:46:01.346798572 +0000 UTC m=+1365.674585248" observedRunningTime="2026-03-21 04:46:01.759748135 +0000 UTC m=+1366.087534811" watchObservedRunningTime="2026-03-21 04:46:01.763888061 +0000 UTC m=+1366.091674737"
Mar 21 04:46:02 crc kubenswrapper[4839]: I0321 04:46:02.796742 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3"}
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.062086 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.062483 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.099215 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.109708 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.805901 4839 generic.go:334] "Generic (PLEG): container finished" podID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerID="66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8" exitCode=0
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.806022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerDied","Data":"66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8"}
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809205 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e"}
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809689 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809738 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.852035 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3344914279999998 podStartE2EDuration="9.852013202s" podCreationTimestamp="2026-03-21 04:45:54 +0000 UTC" firstStartedPulling="2026-03-21 04:45:55.756017906 +0000 UTC m=+1360.083804582" lastFinishedPulling="2026-03-21 04:46:03.27353968 +0000 UTC m=+1367.601326356" observedRunningTime="2026-03-21 04:46:03.851178389 +0000 UTC m=+1368.178965085" watchObservedRunningTime="2026-03-21 04:46:03.852013202 +0000 UTC m=+1368.179799878"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.153909 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.267009 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") "
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.282495 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5" (OuterVolumeSpecName: "kube-api-access-88ds5") pod "75c1454e-0aed-48d9-a0f2-f7c2797156ce" (UID: "75c1454e-0aed-48d9-a0f2-f7c2797156ce"). InnerVolumeSpecName "kube-api-access-88ds5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.370380 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.395445 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.395501 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.429709 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.438714 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.793253 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.807538 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836295 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836532 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerDied","Data":"bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e"}
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836561 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.837376 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.837878 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.254235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"]
Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.262511 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"]
Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.465419 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" path="/var/lib/kubelet/pods/4a2cd29b-967b-4cf6-9902-6f30ad049cb1/volumes"
Mar 21 04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.841189 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.855928 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.875129 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 21 04:46:17 crc kubenswrapper[4839]: I0321 04:46:17.937177 4839 generic.go:334] "Generic (PLEG): container finished" podID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerID="a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc" exitCode=0
Mar 21 04:46:17 crc kubenswrapper[4839]: I0321 04:46:17.937264 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerDied","Data":"a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc"}
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.306415 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") "
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428138 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") "
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428159 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") "
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428289 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") "
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.436495 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76" (OuterVolumeSpecName: "kube-api-access-kmr76") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "kube-api-access-kmr76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.437186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts" (OuterVolumeSpecName: "scripts") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.455834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data" (OuterVolumeSpecName: "config-data") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.460768 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.530975 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531113 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531131 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531144 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerDied","Data":"167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c"}
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959386 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c"
Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959452 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.063299 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 04:46:20 crc kubenswrapper[4839]: E0321 04:46:20.064009 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064029 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync"
Mar 21 04:46:20 crc kubenswrapper[4839]: E0321 04:46:20.064052 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064059 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064253 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064281 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064965 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.067464 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t66x4"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.067666 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.075610 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142282 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142363 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142398 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244354 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244423 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244594 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.249868 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.253366 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.264276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.383376 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.819891 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 21 04:46:20 crc kubenswrapper[4839]: W0321 04:46:20.830445 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod152d0351_12d2_4cf1_ad49_fd943b223442.slice/crio-f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f WatchSource:0}: Error finding container f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f: Status 404 returned error can't find the container with id f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f
Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.968850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"152d0351-12d2-4cf1-ad49-fd943b223442","Type":"ContainerStarted","Data":"f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f"}
Mar 21 04:46:21 crc kubenswrapper[4839]: I0321 04:46:21.982598 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"152d0351-12d2-4cf1-ad49-fd943b223442","Type":"ContainerStarted","Data":"561a4d809fd1b08d12fca4778c4c1c7d41e351f065cfb7df8cce030baf49ce86"}
Mar 21 04:46:21 crc kubenswrapper[4839]: I0321 04:46:21.982796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:22 crc kubenswrapper[4839]: I0321 04:46:22.009895 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.009866404 podStartE2EDuration="2.009866404s" podCreationTimestamp="2026-03-21 04:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:21.999869792 +0000 UTC m=+1386.327656528" watchObservedRunningTime="2026-03-21 04:46:22.009866404 +0000 UTC m=+1386.337653120"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.169448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.413557 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.922423 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"]
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.923878 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.925513 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.925761 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.933226 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"]
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.975986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976061 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976185 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.064868 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.065939 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.068471 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.077958 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078083 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.084357 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.091233 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.101160 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.102552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.135389 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181929 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181990 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.191384 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.229345 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.234142 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.238052 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.239550 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.248437 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.259688 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.274628 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290395 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290457 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod
\"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290927 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.291163 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.310641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.316338 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.327765 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.330267 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.331337 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.333998 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.340134 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.386524 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.393344 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.393602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394033 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394146 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394250 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394374 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394511 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394633 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394831 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.400369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.400819 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 
crc kubenswrapper[4839]: I0321 04:46:26.421211 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.421274 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.422719 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.430264 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.466892 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.497991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc 
kubenswrapper[4839]: I0321 04:46:26.498126 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498215 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498289 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498350 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498405 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498428 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498455 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498530 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.502627 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.503478 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.506385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.506938 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.509731 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: 
I0321 04:46:26.516872 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.528048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.531095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.539706 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600783 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601023 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601222 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.603089 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.603421 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604610 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.624037 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.696703 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.827065 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.849761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.862345 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.082702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerStarted","Data":"deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624"} Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.109126 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.118813 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e89b8f_2e6a_43a6_a9de_162a457cc5fb.slice/crio-578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d WatchSource:0}: Error finding container 578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d: Status 404 returned error can't find the container with id 578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.184960 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.278902 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.280796 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.286803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.286990 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.289710 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.326501 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.356698 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod205b5c5e_c09f_4b4a_8a56_f98531ad0125.slice/crio-5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd WatchSource:0}: Error finding container 5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd: Status 404 returned error can't find the container with id 5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416637 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416783 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: 
\"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518235 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518605 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518738 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.526223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.533998 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.535415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.545839 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrz9\" (UniqueName: 
\"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.556691 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.578303 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod378a796b_e896_48a8_9e03_65e3b371c636.slice/crio-49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13 WatchSource:0}: Error finding container 49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13: Status 404 returned error can't find the container with id 49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13 Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.597786 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.647662 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.099892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.116917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.135517 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerStarted","Data":"51088d3062d2b2360e5c4a54fc629b5fbeeafa49a6e356a501876595e528c519"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.156111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerStarted","Data":"6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.164237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerStarted","Data":"0b486a0ca8e515cbd2ddc4f12af8c937feaf1c88976b8dbba2cd271361b2775c"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175454 4839 generic.go:334] "Generic (PLEG): container finished" podID="378a796b-e896-48a8-9e03-65e3b371c636" containerID="880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa" exitCode=0 Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175523 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerStarted","Data":"49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.206081 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-csj7l" podStartSLOduration=3.206060108 podStartE2EDuration="3.206060108s" podCreationTimestamp="2026-03-21 04:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:28.176499486 +0000 UTC m=+1392.504286162" watchObservedRunningTime="2026-03-21 04:46:28.206060108 +0000 UTC m=+1392.533846774" Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.266831 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.202394 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerStarted","Data":"118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.202789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerStarted","Data":"adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.211713 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerStarted","Data":"d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.211769 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.223902 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jznl6" podStartSLOduration=2.223877025 podStartE2EDuration="2.223877025s" podCreationTimestamp="2026-03-21 04:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:29.221243361 +0000 UTC m=+1393.549030037" watchObservedRunningTime="2026-03-21 04:46:29.223877025 +0000 UTC m=+1393.551663711" Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.257358 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" podStartSLOduration=3.257339287 podStartE2EDuration="3.257339287s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:29.250583167 +0000 UTC m=+1393.578369853" watchObservedRunningTime="2026-03-21 04:46:29.257339287 +0000 UTC m=+1393.585125963" Mar 21 04:46:30 crc kubenswrapper[4839]: I0321 04:46:30.182769 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:30 crc kubenswrapper[4839]: I0321 04:46:30.224978 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.254099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerStarted","Data":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.254152 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.256773 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerStarted","Data":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.263421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.263464 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266063 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266183 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log" containerID="cri-o://3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266461 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata" containerID="cri-o://1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.281425 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.238367916 podStartE2EDuration="6.28140755s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.569247654 +0000 UTC m=+1391.897034340" lastFinishedPulling="2026-03-21 04:46:31.612287298 +0000 UTC m=+1395.940073974" observedRunningTime="2026-03-21 04:46:32.277301205 +0000 UTC m=+1396.605087881" watchObservedRunningTime="2026-03-21 04:46:32.28140755 +0000 UTC m=+1396.609194226" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.301544 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.919290498 podStartE2EDuration="6.301527947s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.230171392 +0000 UTC m=+1391.557958078" lastFinishedPulling="2026-03-21 04:46:31.612408851 +0000 UTC m=+1395.940195527" observedRunningTime="2026-03-21 04:46:32.298323167 +0000 UTC 
m=+1396.626109853" watchObservedRunningTime="2026-03-21 04:46:32.301527947 +0000 UTC m=+1396.629314623" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.330411 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.080120854 podStartE2EDuration="6.330388379s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.362099815 +0000 UTC m=+1391.689886491" lastFinishedPulling="2026-03-21 04:46:31.61236734 +0000 UTC m=+1395.940154016" observedRunningTime="2026-03-21 04:46:32.324765701 +0000 UTC m=+1396.652552377" watchObservedRunningTime="2026-03-21 04:46:32.330388379 +0000 UTC m=+1396.658175055" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.372076 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8874762330000001 podStartE2EDuration="6.372047712s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.132924155 +0000 UTC m=+1391.460710831" lastFinishedPulling="2026-03-21 04:46:31.617495634 +0000 UTC m=+1395.945282310" observedRunningTime="2026-03-21 04:46:32.351222955 +0000 UTC m=+1396.679009631" watchObservedRunningTime="2026-03-21 04:46:32.372047712 +0000 UTC m=+1396.699834398" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.542399 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.542670 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" containerID="cri-o://ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" gracePeriod=30 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.024049 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.171376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.178088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9" (OuterVolumeSpecName: "kube-api-access-fr9w9") pod "76b8f1b8-aa66-4f5e-937a-f837a2da28f1" (UID: "76b8f1b8-aa66-4f5e-937a-f837a2da28f1"). InnerVolumeSpecName "kube-api-access-fr9w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.273184 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276657 4839 generic.go:334] "Generic (PLEG): container finished" podID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" exitCode=2 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276727 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerDied","Data":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerDied","Data":"c3e02332eed0f6ac50479a637c2f9551186161a99dab978e61007f6da0cf9aba"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276893 4839 scope.go:117] "RemoveContainer" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.279234 4839 generic.go:334] "Generic (PLEG): container finished" podID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" exitCode=143 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.279313 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.297416 4839 scope.go:117] "RemoveContainer" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: E0321 04:46:33.297860 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": container with ID starting with ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b not found: ID does not exist" 
containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.297911 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} err="failed to get container status \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": rpc error: code = NotFound desc = could not find container \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": container with ID starting with ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b not found: ID does not exist" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.314348 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.323500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.339021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: E0321 04:46:33.340711 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.340747 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.341623 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.342739 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.348098 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.348496 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.389927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481307 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481377 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpthn\" (UniqueName: 
\"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpthn\" (UniqueName: \"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583451 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583580 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.590285 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.592925 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.601788 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpthn\" (UniqueName: \"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.604330 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.666802 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.154234 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.291606 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1626316f-b029-4424-b783-25eeb2790eb2","Type":"ContainerStarted","Data":"ba925efacaf027591d0a4dc124094b8564705b768ee6312022862c157381c653"} Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.462938 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" path="/var/lib/kubelet/pods/76b8f1b8-aa66-4f5e-937a-f837a2da28f1/volumes" Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.051248 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052098 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent" containerID="cri-o://cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052313 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd" containerID="cri-o://1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052369 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent" containerID="cri-o://1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c" gracePeriod=30 Mar 21 04:46:35 
crc kubenswrapper[4839]: I0321 04:46:35.052317 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core" containerID="cri-o://1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.303627 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1626316f-b029-4424-b783-25eeb2790eb2","Type":"ContainerStarted","Data":"ac03578aadfbad30195691a7bbe3beec9a22dc9381538c0c5f6b5f64c58f29b2"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.303786 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307369 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e" exitCode=0 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307476 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3" exitCode=2 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.338091 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.961919914 podStartE2EDuration="2.338074961s" podCreationTimestamp="2026-03-21 04:46:33 +0000 UTC" firstStartedPulling="2026-03-21 04:46:34.157732951 +0000 UTC m=+1398.485519627" lastFinishedPulling="2026-03-21 04:46:34.533887998 +0000 UTC m=+1398.861674674" observedRunningTime="2026-03-21 04:46:35.334615714 +0000 UTC m=+1399.662402390" watchObservedRunningTime="2026-03-21 04:46:35.338074961 +0000 UTC m=+1399.665861637" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.320072 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03" exitCode=0 Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.320156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03"} Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.325005 4839 generic.go:334] "Generic (PLEG): container finished" podID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerID="6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c" exitCode=0 Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.325110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerDied","Data":"6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c"} Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.517815 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.518122 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.540422 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.540466 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.554562 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.828533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.852444 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7"
Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.939402 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"]
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.335085 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns" containerID="cri-o://c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" gracePeriod=10
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.370134 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.624803 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.624859 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.868224 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968079 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") "
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968243 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") "
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968273 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") "
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") "
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.998387 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts" (OuterVolumeSpecName: "scripts") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.999514 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k" (OuterVolumeSpecName: "kube-api-access-5wj7k") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "kube-api-access-5wj7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.040773 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.052093 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data" (OuterVolumeSpecName: "config-data") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070213 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070256 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070269 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070280 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.115526 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286822 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286852 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286906 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286937 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") "
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.298073 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c" (OuterVolumeSpecName: "kube-api-access-4wj7c") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "kube-api-access-4wj7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346551 4839 generic.go:334] "Generic (PLEG): container finished" podID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" exitCode=0
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"}
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346686 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"21228341591d8e5aec6ec7937412b30e803ee3d8f05a2c9720bd304ab86d36ca"}
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346687 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346706 4839 scope.go:117] "RemoveContainer" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerDied","Data":"deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624"}
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349352 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349463 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.350220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.354421 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config" (OuterVolumeSpecName: "config") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.358911 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.368824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.374166 4839 scope.go:117] "RemoveContainer" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.382487 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389525 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389558 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389580 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389589 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389598 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389606 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.402516 4839 scope.go:117] "RemoveContainer" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"
Mar 21 04:46:38 crc kubenswrapper[4839]: E0321 04:46:38.405220 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": container with ID starting with c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18 not found: ID does not exist" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405270 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"} err="failed to get container status \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": rpc error: code = NotFound desc = could not find container \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": container with ID starting with c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18 not found: ID does not exist"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405299 4839 scope.go:117] "RemoveContainer" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"
Mar 21 04:46:38 crc kubenswrapper[4839]: E0321 04:46:38.405591 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": container with ID starting with 583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b not found: ID does not exist" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405620 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"} err="failed to get container status \"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": rpc error: code = NotFound desc = could not find container \"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": container with ID starting with 583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b not found: ID does not exist"
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.464241 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.464444 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" containerID="cri-o://226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" gracePeriod=30
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.465135 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" containerID="cri-o://7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" gracePeriod=30
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.670895 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"]
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.677957 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"]
Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.891442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.361632 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c" exitCode=0
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.361699 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c"}
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.366028 4839 generic.go:334] "Generic (PLEG): container finished" podID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" exitCode=143
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.366272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"}
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.886990 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947500 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947662 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947684 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947767 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") "
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.948712 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.949008 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.955247 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg" (OuterVolumeSpecName: "kube-api-access-cp7fg") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "kube-api-access-cp7fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.958725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts" (OuterVolumeSpecName: "scripts") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.997782 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050194 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050238 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050249 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050262 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050275 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.052500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.079106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data" (OuterVolumeSpecName: "config-data") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.151709 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.151758 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483"}
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401089 4839 scope.go:117] "RemoveContainer" containerID="1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401207 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.408880 4839 generic.go:334] "Generic (PLEG): container finished" podID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerID="118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13" exitCode=0
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.409152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerDied","Data":"118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13"}
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.409222 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" containerID="cri-o://8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" gracePeriod=30
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.424468 4839 scope.go:117] "RemoveContainer" containerID="1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.468843 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" path="/var/lib/kubelet/pods/439bd408-2f5c-45cc-a2f7-8166a4a279c2/volumes"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.469667 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.469699 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.471008 4839 scope.go:117] "RemoveContainer" containerID="1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.488682 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489130 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="init"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489149 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="init"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489171 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489178 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489189 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489196 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489210 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489226 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489233 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489244 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489249 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core"
Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489265 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489270 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489470 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489513 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489528 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489540 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489549 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489561 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.491330 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.497299 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.497746 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.510493 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.517382 4839 scope.go:117] "RemoveContainer" containerID="cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.520016 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660366 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660386 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660411 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660596 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660819 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762949 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762972 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0"
Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763035 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763957 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.768797 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: 
I0321 04:46:40.768351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.775385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.776307 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.777399 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.785662 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.812777 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.254392 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.421032 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"a7cee1eec896bbb6d355b98092ee2f6320ccf9ad32ba43390936af295231ddd4"} Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.537073 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.543400 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.573040 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.573125 4839 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.860260 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.984872 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.984921 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.985024 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.985103 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.990578 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9" (OuterVolumeSpecName: "kube-api-access-6rrz9") pod 
"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "kube-api-access-6rrz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.990555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts" (OuterVolumeSpecName: "scripts") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.014156 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.018684 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data" (OuterVolumeSpecName: "config-data") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090546 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090614 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090629 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090642 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.435117 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerDied","Data":"adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa"} Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.437021 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.437126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de"} Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 
04:46:42.435309 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.468561 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" path="/var/lib/kubelet/pods/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6/volumes" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.503722 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 04:46:42 crc kubenswrapper[4839]: E0321 04:46:42.504344 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.504369 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.504730 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.505545 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.508745 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.527069 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701138 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701286 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.803115 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc 
kubenswrapper[4839]: I0321 04:46:42.803244 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.803287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.808771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.812633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.840946 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.132561 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.266221 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413451 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413826 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413973 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414009 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414427 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs" (OuterVolumeSpecName: "logs") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414852 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.423160 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h" (OuterVolumeSpecName: "kube-api-access-4zk4h") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "kube-api-access-4zk4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.438680 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data" (OuterVolumeSpecName: "config-data") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.450900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.459931 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.459990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461739 4839 generic.go:334] "Generic (PLEG): container finished" podID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" exitCode=0 Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461777 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461792 4839 scope.go:117] "RemoveContainer" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461893 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.511475 4839 scope.go:117] "RemoveContainer" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.514791 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516121 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516150 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516163 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.527881 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.547847 4839 scope.go:117] "RemoveContainer" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.548356 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": container with ID starting with 7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c not found: ID does not exist" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 
crc kubenswrapper[4839]: I0321 04:46:43.548395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} err="failed to get container status \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": rpc error: code = NotFound desc = could not find container \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": container with ID starting with 7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c not found: ID does not exist" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.548421 4839 scope.go:117] "RemoveContainer" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.551161 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": container with ID starting with 226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0 not found: ID does not exist" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.551195 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"} err="failed to get container status \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": rpc error: code = NotFound desc = could not find container \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": container with ID starting with 226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0 not found: ID does not exist" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554394 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: 
E0321 04:46:43.554856 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554881 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api"
Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.554920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554929 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.555158 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.555176 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.556370 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.558655 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.564518 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.658443 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.685187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720001 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720134 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.822855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823410 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823437 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823455 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.829948 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.831090 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.851458 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0"
Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.873580 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.344386 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.487491 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" path="/var/lib/kubelet/pods/60e89b8f-2e6a-43a6-a9de-162a457cc5fb/volumes"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.496083 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3194b187-fe06-4eed-b725-995cef2b05a0","Type":"ContainerStarted","Data":"f59167442230fe9f9fb6760493e49d3f6b6cafa239d5654d1dabd78b392a18e0"}
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509907 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3194b187-fe06-4eed-b725-995cef2b05a0","Type":"ContainerStarted","Data":"de23fe2c5f0df093ecc4c0d5c58b5ef3097e1868de09c451d0bb724cc7a1addd"}
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509937 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518126 4839 generic.go:334] "Generic (PLEG): container finished" podID="62694a5a-1565-4831-bff3-504a782692bb" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" exitCode=0
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518308 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerDied","Data":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"}
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.519111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerDied","Data":"0b486a0ca8e515cbd2ddc4f12af8c937feaf1c88976b8dbba2cd271361b2775c"}
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.519142 4839 scope.go:117] "RemoveContainer" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536273 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") "
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536336 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") "
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536452 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") "
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.542109 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.542091353 podStartE2EDuration="2.542091353s" podCreationTimestamp="2026-03-21 04:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:44.529451197 +0000 UTC m=+1408.857237873" watchObservedRunningTime="2026-03-21 04:46:44.542091353 +0000 UTC m=+1408.869878029"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.548040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d" (OuterVolumeSpecName: "kube-api-access-6ww4d") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "kube-api-access-6ww4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.555088 4839 scope.go:117] "RemoveContainer" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"
Mar 21 04:46:44 crc kubenswrapper[4839]: E0321 04:46:44.559739 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": container with ID starting with 8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32 not found: ID does not exist" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.559785 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"} err="failed to get container status \"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": rpc error: code = NotFound desc = could not find container \"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": container with ID starting with 8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32 not found: ID does not exist"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.569612 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data" (OuterVolumeSpecName: "config-data") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.570863 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639034 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639068 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639080 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.696948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.696980 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.005442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.019022 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.029787 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:45 crc kubenswrapper[4839]: E0321 04:46:45.030480 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.030498 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.030709 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.031357 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.047755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.069119 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148334 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148397 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.249961 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.250036 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.250074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.254142 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.255263 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.309257 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.348515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530616 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"}
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"}
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"b12b8ea76c192eae2fc0d2d772e7797d514c5f048bf2be61c0b8a59a9057d453"}
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.551273 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db"}
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.551344 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.555777 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.555735862 podStartE2EDuration="2.555735862s" podCreationTimestamp="2026-03-21 04:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:45.549893527 +0000 UTC m=+1409.877680203" watchObservedRunningTime="2026-03-21 04:46:45.555735862 +0000 UTC m=+1409.883522568"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.589111 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.022949639 podStartE2EDuration="5.58908796s" podCreationTimestamp="2026-03-21 04:46:40 +0000 UTC" firstStartedPulling="2026-03-21 04:46:41.26295627 +0000 UTC m=+1405.590742946" lastFinishedPulling="2026-03-21 04:46:44.829094591 +0000 UTC m=+1409.156881267" observedRunningTime="2026-03-21 04:46:45.579756758 +0000 UTC m=+1409.907543454" watchObservedRunningTime="2026-03-21 04:46:45.58908796 +0000 UTC m=+1409.916874636"
Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.850505 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:46:45 crc kubenswrapper[4839]: W0321 04:46:45.864793 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee2fcd4_456d_436a_ae9e_95f8224e2834.slice/crio-2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36 WatchSource:0}: Error finding container 2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36: Status 404 returned error can't find the container with id 2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36
Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.469756 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62694a5a-1565-4831-bff3-504a782692bb" path="/var/lib/kubelet/pods/62694a5a-1565-4831-bff3-504a782692bb/volumes"
Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.561107 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerStarted","Data":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"}
Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.561155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerStarted","Data":"2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36"}
Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.601256 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.6012325280000002 podStartE2EDuration="1.601232528s" podCreationTimestamp="2026-03-21 04:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:46.593761558 +0000 UTC m=+1410.921548234" watchObservedRunningTime="2026-03-21 04:46:46.601232528 +0000 UTC m=+1410.929019204"
Mar 21 04:46:47 crc kubenswrapper[4839]: I0321 04:46:47.784666 4839 scope.go:117] "RemoveContainer" containerID="54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df"
Mar 21 04:46:48 crc kubenswrapper[4839]: I0321 04:46:48.159480 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 21 04:46:50 crc kubenswrapper[4839]: I0321 04:46:50.349306 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 21 04:46:53 crc kubenswrapper[4839]: I0321 04:46:53.875249 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:46:53 crc kubenswrapper[4839]: I0321 04:46:53.875868 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.073990 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.116777 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.351141 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.377774 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 21 04:46:56 crc kubenswrapper[4839]: I0321 04:46:56.085661 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 21 04:47:01 crc kubenswrapper[4839]: I0321 04:47:01.875258 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 04:47:01 crc kubenswrapper[4839]: I0321 04:47:01.875832 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.800843 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.806459 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861486 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861520 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861711 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861757 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861773 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861863 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") "
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.862390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs" (OuterVolumeSpecName: "logs") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.867274 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm" (OuterVolumeSpecName: "kube-api-access-2dwgm") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "kube-api-access-2dwgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.867592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl" (OuterVolumeSpecName: "kube-api-access-5xbgl") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "kube-api-access-5xbgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.889291 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.890076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data" (OuterVolumeSpecName: "config-data") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.894374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data" (OuterVolumeSpecName: "config-data") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.901549 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963897 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963951 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963967 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963982 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963994 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.964006 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.964018 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158875 4839 generic.go:334] "Generic (PLEG): container finished" podID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae" exitCode=137
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerDied","Data":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158983 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.159001 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerDied","Data":"51088d3062d2b2360e5c4a54fc629b5fbeeafa49a6e356a501876595e528c519"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.159019 4839 scope.go:117] "RemoveContainer" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.160924 4839 generic.go:334] "Generic (PLEG): container finished" podID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" exitCode=137
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161070 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161238 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.198725 4839 scope.go:117] "RemoveContainer" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.199401 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": container with ID starting with 09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae not found: ID does not exist" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.199448 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"} err="failed to get container status \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": rpc error: code = NotFound desc = could not find container \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": container with ID starting with 09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae not found: ID does not exist"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.199471 4839 scope.go:117] "RemoveContainer" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.203739 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.225238 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.237793 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.241209 4839 scope.go:117] "RemoveContainer" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.248936 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249438 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249461 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249502 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249516 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249524 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c"
containerName="nova-cell1-novncproxy-novncproxy" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249752 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249767 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.250527 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256412 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256703 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256723 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.259661 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.273074 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.284538 4839 scope.go:117] "RemoveContainer" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.285041 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": container with ID starting with 
1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea not found: ID does not exist" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285069 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"} err="failed to get container status \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": rpc error: code = NotFound desc = could not find container \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": container with ID starting with 1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea not found: ID does not exist" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285090 4839 scope.go:117] "RemoveContainer" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285714 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.287742 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.290175 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": container with ID starting with 3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c not found: ID does not exist" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290262 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} err="failed to get container status \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": rpc error: code = NotFound desc = could not find container \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": container with ID starting with 3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c not found: ID does not exist" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290560 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.303755 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.370653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371347 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371622 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475214 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475301 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475358 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475668 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " 
pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475971 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.476020 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488670 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488684 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488748 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.493862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.569513 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577356 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577543 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578238 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578394 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581175 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581964 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.593488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.605012 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.883142 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.884497 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.887666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.992288 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:47:03 crc kubenswrapper[4839]: W0321 04:47:03.993259 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ddf8fc2_ec2a_4b98_aa76_2dc43426e3f2.slice/crio-b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599 WatchSource:0}: Error finding container b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599: Status 404 returned error can't find the container with id b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599 Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.100149 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:04 crc kubenswrapper[4839]: W0321 04:47:04.101255 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e28a9be_2244_43bb_9043_2ededa502897.slice/crio-6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3 WatchSource:0}: Error finding container 6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3: Status 404 returned error can't find the container with id 6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3 Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 
04:47:04.175060 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2","Type":"ContainerStarted","Data":"b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599"} Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.180689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3"} Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.188046 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.397462 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.401889 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.418067 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.473707 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" path="/var/lib/kubelet/pods/205b5c5e-c09f-4b4a-8a56-f98531ad0125/volumes" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.474480 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" path="/var/lib/kubelet/pods/3307932f-5c67-4abb-9649-e4b3a0a19e9c/volumes" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500023 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500159 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500214 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500275 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500318 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603447 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603591 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603625 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604713 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604857 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604869 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.605340 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.629286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.851645 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.191712 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2","Type":"ContainerStarted","Data":"51609e39b21ec93da79fc362ba18f8cef3ba4ca1acff776bbc773773aba829bf"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.194931 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.194979 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.228793 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.228770189 podStartE2EDuration="2.228770189s" podCreationTimestamp="2026-03-21 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:05.225326402 +0000 UTC m=+1429.553113098" watchObservedRunningTime="2026-03-21 04:47:05.228770189 +0000 UTC m=+1429.556556875"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.277907 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.277882882 podStartE2EDuration="2.277882882s" podCreationTimestamp="2026-03-21 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:05.25471438 +0000 UTC m=+1429.582501076" watchObservedRunningTime="2026-03-21 04:47:05.277882882 +0000 UTC m=+1429.605669568"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.396533 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"]
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.213422 4839 generic.go:334] "Generic (PLEG): container finished" podID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerID="9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb" exitCode=0
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.214027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb"}
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.214068 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerStarted","Data":"223ef65b13d73e2f7904cd94127f13fb98845ae46f6fe3c063ed71c9184d7fbc"}
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.370362 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371253 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent" containerID="cri-o://c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de" gracePeriod=30
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371323 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" containerID="cri-o://9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db" gracePeriod=30
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371368 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core" containerID="cri-o://b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974" gracePeriod=30
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371425 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent" containerID="cri-o://81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387" gracePeriod=30
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.386272 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": read tcp 10.217.0.2:57470->10.217.0.206:3000: read: connection reset by peer"
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.589275 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:06 crc kubenswrapper[4839]: E0321 04:47:06.787723 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab85f3f4_7277_419b_96bc_4f56d5891b16.slice/crio-9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab85f3f4_7277_419b_96bc_4f56d5891b16.slice/crio-conmon-9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.227333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerStarted","Data":"49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28"}
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.227661 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.255442 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" podStartSLOduration=3.2554163799999998 podStartE2EDuration="3.25541638s" podCreationTimestamp="2026-03-21 04:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:07.249274577 +0000 UTC m=+1431.577061263" watchObservedRunningTime="2026-03-21 04:47:07.25541638 +0000 UTC m=+1431.583203056"
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258127 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db" exitCode=0
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258163 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974" exitCode=2
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258175 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387" exitCode=0
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258185 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de" exitCode=0
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258381 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" containerID="cri-o://31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" gracePeriod=30
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258682 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db"}
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258720 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974"}
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258736 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387"}
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258746 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de"}
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.259061 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" containerID="cri-o://deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" gracePeriod=30
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.492309 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672630 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672745 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672829 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672873 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672934 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") "
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673041 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673578 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673865 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.680000 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj" (OuterVolumeSpecName: "kube-api-access-mtppj") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "kube-api-access-mtppj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.681677 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts" (OuterVolumeSpecName: "scripts") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.716367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.729745 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775083 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775119 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775134 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775146 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775157 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.777311 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.807585 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data" (OuterVolumeSpecName: "config-data") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.877608 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.877869 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.275351 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"a7cee1eec896bbb6d355b98092ee2f6320ccf9ad32ba43390936af295231ddd4"}
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.276300 4839 scope.go:117] "RemoveContainer" containerID="9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.275727 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.278881 4839 generic.go:334] "Generic (PLEG): container finished" podID="233bba1a-658e-4073-acb5-c80398a849f1" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" exitCode=143
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.279035 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"}
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.301700 4839 scope.go:117] "RemoveContainer" containerID="b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.316825 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.335608 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.339287 4839 scope.go:117] "RemoveContainer" containerID="81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.362659 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363357 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363392 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd"
Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363428 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363437 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363479 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363496 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363504 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363717 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363733 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363753 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363796 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.366046 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.368782 4839 scope.go:117] "RemoveContainer" containerID="c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.369393 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.370561 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.371238 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.389251 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390458 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390538 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390599 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.389696 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.472337 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" path="/var/lib/kubelet/pods/ab85f3f4-7277-419b-96bc-4f56d5891b16/volumes"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.491976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492059 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492089 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492125 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492175 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493216 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493277 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493659 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.497279 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.497985 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.498591 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.499025 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.500114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.508695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.570051 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.777277 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:47:09 crc kubenswrapper[4839]: W0321 04:47:09.229907 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1041d12_2cae_4009_a3f3_9df6e219d03b.slice/crio-1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f WatchSource:0}: Error finding container 1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f: Status 404 returned error can't find the container with id 1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f
Mar 21 04:47:09 crc kubenswrapper[4839]: I0321 04:47:09.238791 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:09 crc kubenswrapper[4839]: I0321 04:47:09.296185 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f"}
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.304858 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"55d2f2d12a7309ea8eebf9d85f9e7ac8e6fc4bfa05f1a19e2884034606a058b9"}
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.930662 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955309 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") "
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955436 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") "
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") "
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955675 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") "
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.963675 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs" (OuterVolumeSpecName: "logs") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.968151 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj" (OuterVolumeSpecName: "kube-api-access-cbjsj") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "kube-api-access-cbjsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.988971 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data" (OuterVolumeSpecName: "config-data") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.997522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058305 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058332 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058341 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058350 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319739 4839 generic.go:334] "Generic (PLEG): container finished" podID="233bba1a-658e-4073-acb5-c80398a849f1" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" exitCode=0 Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"} Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"b12b8ea76c192eae2fc0d2d772e7797d514c5f048bf2be61c0b8a59a9057d453"} Mar 21 04:47:11 crc kubenswrapper[4839]: 
I0321 04:47:11.319841 4839 scope.go:117] "RemoveContainer" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319934 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.326282 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"1ca8631fd39c71f0c76e109d72496b0706ab64826ec05aa45944e9a60f5abc30"} Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.367654 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.371335 4839 scope.go:117] "RemoveContainer" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.393754 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.426974 4839 scope.go:117] "RemoveContainer" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.431279 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": container with ID starting with deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f not found: ID does not exist" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432185 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"} err="failed to get container status 
\"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": rpc error: code = NotFound desc = could not find container \"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": container with ID starting with deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f not found: ID does not exist" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432225 4839 scope.go:117] "RemoveContainer" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432591 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.432754 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": container with ID starting with 31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92 not found: ID does not exist" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432814 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"} err="failed to get container status \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": rpc error: code = NotFound desc = could not find container \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": container with ID starting with 31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92 not found: ID does not exist" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.433097 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433165 4839 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.433341 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433461 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433729 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433802 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.434954 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.437164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.441211 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.442294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.446562 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572251 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572604 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.677962 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678071 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " 
pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678152 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678168 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.680698 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.693411 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.693824 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.694034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.694859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.697836 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.767438 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.253387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:12 crc kubenswrapper[4839]: W0321 04:47:12.254920 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod968e5045_c2d8_4fba_9011_0a81fa2b95a3.slice/crio-3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64 WatchSource:0}: Error finding container 3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64: Status 404 returned error can't find the container with id 3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64 Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.350727 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"fd6c67538c0b1e67c9621e2de3fecc0f885e4fbb28c56c5b9b0fbfb23c369ac4"} Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.351764 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64"} Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.463300 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233bba1a-658e-4073-acb5-c80398a849f1" path="/var/lib/kubelet/pods/233bba1a-658e-4073-acb5-c80398a849f1/volumes" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.366062 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"} Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.366453 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"} Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.606123 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.606175 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.609346 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.635552 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.667915 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6678926020000002 podStartE2EDuration="2.667892602s" podCreationTimestamp="2026-03-21 04:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:13.407712399 +0000 UTC m=+1437.735499085" watchObservedRunningTime="2026-03-21 04:47:13.667892602 +0000 UTC m=+1437.995679278" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.378375 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"de018109d491c3a05b4a9a0a0f84bf56007711a954d24819a83da36ce16df2f6"} Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.396432 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.397053 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.626955733 podStartE2EDuration="6.397033604s" podCreationTimestamp="2026-03-21 04:47:08 +0000 UTC" firstStartedPulling="2026-03-21 04:47:09.232541087 +0000 UTC m=+1433.560327763" lastFinishedPulling="2026-03-21 04:47:13.002618948 +0000 UTC m=+1437.330405634" observedRunningTime="2026-03-21 04:47:14.396800377 +0000 UTC m=+1438.724587053" watchObservedRunningTime="2026-03-21 04:47:14.397033604 +0000 UTC m=+1438.724820280" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.572022 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.573882 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.581418 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.584415 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.588233 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.628724 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.628720 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.632913 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633054 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633084 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734714 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734852 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734879 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.740842 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.741124 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod 
\"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.748157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.754952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.852773 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.903004 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm"
Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.937960 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"]
Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.938193 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns" containerID="cri-o://d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa" gracePeriod=10
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.409758 4839 generic.go:334] "Generic (PLEG): container finished" podID="378a796b-e896-48a8-9e03-65e3b371c636" containerID="d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa" exitCode=0
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.410398 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa"}
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.412353 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.463518 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"]
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.483228 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7"
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655244 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655620 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655777 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655822 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655910 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") "
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.661326 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4" (OuterVolumeSpecName: "kube-api-access-4c6c4") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "kube-api-access-4c6c4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.703367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.711016 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.716754 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.724334 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config" (OuterVolumeSpecName: "config") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.735803 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758941 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758984 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758999 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.759016 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.759030 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.759044 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.425024 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerStarted","Data":"07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d"}
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.425451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerStarted","Data":"9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e"}
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.426655 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7"
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.427339 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13"}
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.427374 4839 scope.go:117] "RemoveContainer" containerID="d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa"
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.456692 4839 scope.go:117] "RemoveContainer" containerID="880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa"
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.468382 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-f7kjm" podStartSLOduration=2.468366173 podStartE2EDuration="2.468366173s" podCreationTimestamp="2026-03-21 04:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:16.442529666 +0000 UTC m=+1440.770316362" watchObservedRunningTime="2026-03-21 04:47:16.468366173 +0000 UTC m=+1440.796152839"
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.490805 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"]
Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.500186 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"]
Mar 21 04:47:18 crc kubenswrapper[4839]: I0321 04:47:18.466936 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378a796b-e896-48a8-9e03-65e3b371c636" path="/var/lib/kubelet/pods/378a796b-e896-48a8-9e03-65e3b371c636/volumes"
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.485273 4839 generic.go:334] "Generic (PLEG): container finished" podID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerID="07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d" exitCode=0
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.485349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerDied","Data":"07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d"}
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.605087 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.605404 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.768383 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.768443 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.783718 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.783773 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.902869 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004470 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") "
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004592 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") "
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004722 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") "
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004752 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") "
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.010797 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts" (OuterVolumeSpecName: "scripts") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.012420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd" (OuterVolumeSpecName: "kube-api-access-9fqkd") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "kube-api-access-9fqkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.031011 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.049256 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data" (OuterVolumeSpecName: "config-data") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107413 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107444 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107455 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107463 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503528 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerDied","Data":"9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e"}
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503979 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503584 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.610077 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.610830 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.615187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.698989 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.699316 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" containerID="cri-o://2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" gracePeriod=30
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.699842 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" containerID="cri-o://4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" gracePeriod=30
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.716000 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.716260 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" containerID="cri-o://09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" gracePeriod=30
Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.739768 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.514157 4839 generic.go:334] "Generic (PLEG): container finished" podID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" exitCode=143
Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.514246 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"}
Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.520659 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.350754 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.352377 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.353976 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.354051 4839 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler"
Mar 21 04:47:25 crc kubenswrapper[4839]: I0321 04:47:25.521849 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" containerID="cri-o://c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" gracePeriod=30
Mar 21 04:47:25 crc kubenswrapper[4839]: I0321 04:47:25.522070 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" containerID="cri-o://a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" gracePeriod=30
Mar 21 04:47:26 crc kubenswrapper[4839]: I0321 04:47:26.531644 4839 generic.go:334] "Generic (PLEG): container finished" podID="8e28a9be-2244-43bb-9043-2ededa502897" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" exitCode=143
Mar 21 04:47:26 crc kubenswrapper[4839]: I0321 04:47:26.531683 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"}
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.491638 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560512 4839 generic.go:334] "Generic (PLEG): container finished" podID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" exitCode=0
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560588 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"}
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64"}
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560642 4839 scope.go:117] "RemoveContainer" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560821 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.590327 4839 scope.go:117] "RemoveContainer" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609023 4839 scope.go:117] "RemoveContainer" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.609497 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": container with ID starting with 4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c not found: ID does not exist" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609528 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"} err="failed to get container status \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": rpc error: code = NotFound desc = could not find container \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": container with ID starting with 4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c not found: ID does not exist"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609548 4839 scope.go:117] "RemoveContainer" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.609850 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": container with ID starting with 2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad not found: ID does not exist" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609879 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"} err="failed to get container status \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": rpc error: code = NotFound desc = could not find container \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": container with ID starting with 2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad not found: ID does not exist"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649480 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649556 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs" (OuterVolumeSpecName: "logs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651669 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.652171 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") "
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.653040 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.668255 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w" (OuterVolumeSpecName: "kube-api-access-7gj6w") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "kube-api-access-7gj6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.681120 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.681424 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data" (OuterVolumeSpecName: "config-data") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.698466 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.701100 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754253 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754295 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754306 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754314 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754323 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.913473 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.943303 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.954621 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955077 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955089 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955103 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955108 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955130 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955136 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955153 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="init"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955160 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="init"
Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955167 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955193 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955349 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955362 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955374 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955385 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.956519 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959126 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959379 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.975508 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058533 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058557 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058694 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058832 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.061078 4839 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160322 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160473 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160506 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160586 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.161865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.165434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.166145 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.166263 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.167126 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.195865 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261827 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261879 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261935 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261999 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: 
\"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.262713 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs" (OuterVolumeSpecName: "logs") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.282459 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw" (OuterVolumeSpecName: "kube-api-access-2b7fw") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "kube-api-access-2b7fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.291510 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.319783 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data" (OuterVolumeSpecName: "config-data") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.351797 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.364918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373091 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373135 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373150 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373162 4839 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373173 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.430024 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574437 4839 generic.go:334] "Generic (PLEG): container finished" podID="8e28a9be-2244-43bb-9043-2ededa502897" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" exitCode=0 Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574587 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574860 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574678 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574887 4839 scope.go:117] "RemoveContainer" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.576580 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.576673 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.577276 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580542 4839 generic.go:334] "Generic (PLEG): container finished" podID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" exitCode=0 Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerDied","Data":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580631 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerDied","Data":"2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580702 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.581005 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp" (OuterVolumeSpecName: "kube-api-access-4szpp") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "kube-api-access-4szpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.599844 4839 scope.go:117] "RemoveContainer" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.610577 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data" (OuterVolumeSpecName: "config-data") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.613064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.619138 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.641826 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.660760 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661353 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661370 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661389 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661396 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661423 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661436 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661667 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 
04:47:29.661688 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661703 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661948 4839 scope.go:117] "RemoveContainer" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.662953 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.672186 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": container with ID starting with a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5 not found: ID does not exist" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.672251 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"} err="failed to get container status \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": rpc error: code = NotFound desc = could not find container \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": container with ID starting with a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.672284 4839 scope.go:117] "RemoveContainer" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: 
I0321 04:47:29.673178 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.673390 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.673466 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.678928 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": container with ID starting with c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48 not found: ID does not exist" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.678991 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"} err="failed to get container status \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": rpc error: code = NotFound desc = could not find container \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": container with ID starting with c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679033 4839 scope.go:117] "RemoveContainer" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679912 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc 
kubenswrapper[4839]: I0321 04:47:29.679946 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679959 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.711560 4839 scope.go:117] "RemoveContainer" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.712275 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": container with ID starting with 09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978 not found: ID does not exist" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.712322 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"} err="failed to get container status \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": rpc error: code = NotFound desc = could not find container \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": container with ID starting with 09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.781938 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782012 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782160 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782200 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gkg\" (UniqueName: \"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782406 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: W0321 04:47:29.876706 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627bf6a3_cf5d_42e1_9250_ba6684bb2cfc.slice/crio-3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863 WatchSource:0}: Error finding container 3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863: Status 404 returned error can't find the container with id 3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.877539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885556 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885627 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885720 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gkg\" (UniqueName: \"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.886287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.889954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.890218 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.890306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.905598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gkg\" (UniqueName: \"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.926053 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.939104 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.960613 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.962829 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.965407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.972069 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.988844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.988921 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.989078 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.995998 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.089762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.090368 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.090465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.095724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.098950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.113149 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.292471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.463743 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" path="/var/lib/kubelet/pods/1ee2fcd4-456d-436a-ae9e-95f8224e2834/volumes"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.464655 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e28a9be-2244-43bb-9043-2ededa502897" path="/var/lib/kubelet/pods/8e28a9be-2244-43bb-9043-2ededa502897/volumes"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.465321 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" path="/var/lib/kubelet/pods/968e5045-c2d8-4fba-9011-0a81fa2b95a3/volumes"
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"7940cd1faee5fd561ec476f66f5450aa5a8dc708421e7e79724db8e7453dedca"}
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595596 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"0b820abaeb5bf853c0bbac610f2e9119d63d24e6251736b33d73065353027322"}
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595614 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863"}
Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.624347 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.624330559 podStartE2EDuration="2.624330559s" podCreationTimestamp="2026-03-21 04:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:30.622764035 +0000 UTC m=+1454.950550721" watchObservedRunningTime="2026-03-21 04:47:30.624330559 +0000 UTC m=+1454.952117235"
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.139140 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 04:47:31 crc kubenswrapper[4839]: W0321 04:47:31.144275 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbecccff_0ecc_44ff_a57b_f7289b8bcf5a.slice/crio-532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873 WatchSource:0}: Error finding container 532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873: Status 404 returned error can't find the container with id 532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.147273 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607830 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"a118aa3df2fe7292795057b6d29b804d3a5495d74d5bab2d0e6ec99ace13ba78"}
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"147b88cf5646198b8c45418c7c3437d1bb597e68449481a9d3059a44ca3dc5c8"}
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"ab89a0ab3ec9c73d28ffe25d85bfed521e615c51fda977373fb2f11682983456"}
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.610361 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a","Type":"ContainerStarted","Data":"10a225aeaf244148b45f1297b659b648d1a6ae727a554cc7ee1ac3dd86eb8195"}
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.610425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a","Type":"ContainerStarted","Data":"532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873"}
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.637759 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.637734672 podStartE2EDuration="2.637734672s" podCreationTimestamp="2026-03-21 04:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:31.627424812 +0000 UTC m=+1455.955211508" watchObservedRunningTime="2026-03-21 04:47:31.637734672 +0000 UTC m=+1455.965521348"
Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.653120 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.653099555 podStartE2EDuration="2.653099555s" podCreationTimestamp="2026-03-21 04:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:31.640395087 +0000 UTC m=+1455.968181823" watchObservedRunningTime="2026-03-21 04:47:31.653099555 +0000 UTC m=+1455.980886231"
Mar 21 04:47:35 crc kubenswrapper[4839]: I0321 04:47:35.292904 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 21 04:47:38 crc kubenswrapper[4839]: I0321 04:47:38.786948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.353659 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.353704 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.996609 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.996769 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.293515 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.347385 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.366773 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="627bf6a3-cf5d-42e1-9250-ba6684bb2cfc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.366841 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="627bf6a3-cf5d-42e1-9250-ba6684bb2cfc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.730109 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 21 04:47:41 crc kubenswrapper[4839]: I0321 04:47:41.010734 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0aafbc7f-e890-4a32-8531-f148aeea18e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:41 crc kubenswrapper[4839]: I0321 04:47:41.010739 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0aafbc7f-e890-4a32-8531-f148aeea18e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.352629 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.353003 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.996671 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.996732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.362318 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.368337 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.374833 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.791751 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.004484 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.010169 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.011520 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.794315 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 04:47:59 crc kubenswrapper[4839]: I0321 04:47:59.004154 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:47:59 crc kubenswrapper[4839]: I0321 04:47:59.928235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.146647 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.147926 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.152514 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.152541 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.157852 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.158744 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.212263 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.313936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.337774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.469533 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:01 crc kubenswrapper[4839]: I0321 04:48:01.002082 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:48:01 crc kubenswrapper[4839]: I0321 04:48:01.922725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerStarted","Data":"4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c"}
Mar 21 04:48:02 crc kubenswrapper[4839]: I0321 04:48:02.933871 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerID="fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88" exitCode=0
Mar 21 04:48:02 crc kubenswrapper[4839]: I0321 04:48:02.933973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerDied","Data":"fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88"}
Mar 21 04:48:03 crc kubenswrapper[4839]: I0321 04:48:03.608343 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" containerID="cri-o://804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" gracePeriod=604796
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.264701 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.395766 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"5d56af53-fce2-4320-b4fa-32b5c6798921\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") "
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.401097 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7" (OuterVolumeSpecName: "kube-api-access-99cb7") pod "5d56af53-fce2-4320-b4fa-32b5c6798921" (UID: "5d56af53-fce2-4320-b4fa-32b5c6798921"). InnerVolumeSpecName "kube-api-access-99cb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.498133 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.950761 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq" containerID="cri-o://7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" gracePeriod=604795
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956018 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerDied","Data":"4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c"}
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956064 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c"
Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9"
Mar 21 04:48:05 crc kubenswrapper[4839]: I0321 04:48:05.334031 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"]
Mar 21 04:48:05 crc kubenswrapper[4839]: I0321 04:48:05.343245 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"]
Mar 21 04:48:06 crc kubenswrapper[4839]: I0321 04:48:06.473621 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" path="/var/lib/kubelet/pods/ab3902e0-a483-447f-b86c-4fe8e8983152/volumes"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.661184 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"]
Mar 21 04:48:08 crc kubenswrapper[4839]: E0321 04:48:08.662890 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.662912 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.663151 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.664885 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.677758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"]
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776248 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776337 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.892754 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.892903 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.893008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.894078 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.894316 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.916076 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.995690 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97"
Mar 21 04:48:09 crc kubenswrapper[4839]: I0321 04:48:09.475360 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"]
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.005349 4839 generic.go:334] "Generic (PLEG): container finished" podID="8028561c-b039-4400-a065-b5efee753b5f" containerID="804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" exitCode=0
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.005647 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92"}
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007862 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" exitCode=0
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b"}
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007944 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"4d51b3ae90fbcc99929a148d8825194077e5974286f615cf5f4f49328360ebc9"}
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.227959 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323094 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323232 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323314 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323451 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323499 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323526 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323650 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323879 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324023 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") "
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325265 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325286 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325296 4839 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.333973 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info" (OuterVolumeSpecName: "pod-info") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.361317 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-tls".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.362858 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.362985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.363149 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd" (OuterVolumeSpecName: "kube-api-access-vh2cd") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "kube-api-access-vh2cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.407109 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data" (OuterVolumeSpecName: "config-data") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.409138 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf" (OuterVolumeSpecName: "server-conf") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427338 4839 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427673 4839 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427783 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427930 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428033 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428131 4839 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428443 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.479837 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.494716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.530393 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.530425 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020694 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54"} Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020746 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020771 4839 scope.go:117] "RemoveContainer" containerID="804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.022983 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"} Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.044768 4839 scope.go:117] "RemoveContainer" containerID="fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.075199 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.095376 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115320 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: E0321 04:48:11.115857 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="setup-container" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115877 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="setup-container" Mar 21 04:48:11 crc kubenswrapper[4839]: E0321 04:48:11.115914 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115921 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc 
kubenswrapper[4839]: I0321 04:48:11.116084 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.117045 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.120440 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.120515 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126061 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nxhtb" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126287 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.130271 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.136376 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.136551 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246963 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247061 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247241 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " 
pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247598 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247634 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 
04:48:11.348975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349070 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349137 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349157 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349291 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349652 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") 
" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349918 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.350907 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.351298 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.351922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.355548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.355723 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.357409 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.368242 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.375553 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " 
pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.386913 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.464021 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.476414 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555519 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555582 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555612 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555631 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555722 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc 
kubenswrapper[4839]: I0321 04:48:11.555749 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") "
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") "
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.556757 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.558916 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.560989 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582612 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582617 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582667 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz" (OuterVolumeSpecName: "kube-api-access-rb4vz") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "kube-api-access-rb4vz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.583127 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.587253 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data" (OuterVolumeSpecName: "config-data") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.590126 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info" (OuterVolumeSpecName: "pod-info") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.626431 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf" (OuterVolumeSpecName: "server-conf") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.657946 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.657996 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658008 4839 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658021 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658031 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658039 4839 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658046 4839 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658249 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658260 4839 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658268 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.715139 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.749844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.760171 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.760214 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.026109 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:48:12 crc kubenswrapper[4839]: W0321 04:48:12.031063 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfff67da_8ea4_4798_9b8d_58a3abac4347.slice/crio-46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5 WatchSource:0}: Error finding container 46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5: Status 404 returned error can't find the container with id 46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.052264 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" exitCode=0
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.052369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"}
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055407 4839 generic.go:334] "Generic (PLEG): container finished" podID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" exitCode=0
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055463 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"}
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541"}
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055502 4839 scope.go:117] "RemoveContainer" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055635 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.212926 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.224744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.226386 4839 scope.go:117] "RemoveContainer" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.239771 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.240242 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240260 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq"
Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.240289 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="setup-container"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240296 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="setup-container"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240506 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.241972 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246011 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246030 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246333 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246429 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wq8rw"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246514 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246642 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246528 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.250477 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.298541 4839 scope.go:117] "RemoveContainer" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"
Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.299458 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": container with ID starting with 7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318 not found: ID does not exist" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299506 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"} err="failed to get container status \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": rpc error: code = NotFound desc = could not find container \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": container with ID starting with 7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318 not found: ID does not exist"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299530 4839 scope.go:117] "RemoveContainer" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"
Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.299877 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": container with ID starting with e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182 not found: ID does not exist" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299917 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} err="failed to get container status \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": rpc error: code = NotFound desc = could not find container \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": container with ID starting with e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182 not found: ID does not exist"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380035 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380114 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380291 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380456 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380511 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380554 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380684 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380730 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.465371 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" path="/var/lib/kubelet/pods/6e1d0e8c-00aa-4770-9e58-b8f706d80a35/volumes"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.466282 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8028561c-b039-4400-a065-b5efee753b5f" path="/var/lib/kubelet/pods/8028561c-b039-4400-a065-b5efee753b5f/volumes"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482451 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482536 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482610 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482647 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483205 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483439 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483502 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483513 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483760 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483850 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483874 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483919 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.484927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.487320 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.487351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.488004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.489766 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.507250 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.529370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.596969 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"]
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.608312 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.615708 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.616723 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.622121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"]
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.691934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692222 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692383 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692677 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692804 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692995 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794751 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795019 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795042 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.796105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.796774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.797492 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.801153 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.802395 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.804466 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.826488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.933738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx"
Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.088134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5"}
Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.239611 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:48:13 crc kubenswrapper[4839]: W0321 04:48:13.440124 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa3a88fe_e92a_48a2_9d53_97c2e2c16407.slice/crio-6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3 WatchSource:0}: Error finding container 6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3: Status 404 returned error can't find the container with id
6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3 Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.448323 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.099647 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" exitCode=0 Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.099753 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.100042 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerStarted","Data":"6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.101222 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"25a926428acdd1dd9442d2adc6952a7148d2fc65ae1286292b5a5776f9568879"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.107332 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.112885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.171405 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwl97" podStartSLOduration=3.470987738 podStartE2EDuration="6.171377384s" podCreationTimestamp="2026-03-21 04:48:08 +0000 UTC" firstStartedPulling="2026-03-21 04:48:10.014818196 +0000 UTC m=+1494.342604872" lastFinishedPulling="2026-03-21 04:48:12.715207842 +0000 UTC m=+1497.042994518" observedRunningTime="2026-03-21 04:48:14.152246587 +0000 UTC m=+1498.480033263" watchObservedRunningTime="2026-03-21 04:48:14.171377384 +0000 UTC m=+1498.499164070" Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.123509 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerStarted","Data":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.123877 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.125334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7"} Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.182178 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" podStartSLOduration=3.182156365 podStartE2EDuration="3.182156365s" podCreationTimestamp="2026-03-21 04:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-21 04:48:15.14813921 +0000 UTC m=+1499.475925906" watchObservedRunningTime="2026-03-21 04:48:15.182156365 +0000 UTC m=+1499.509943061" Mar 21 04:48:18 crc kubenswrapper[4839]: I0321 04:48:18.996058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:18 crc kubenswrapper[4839]: I0321 04:48:18.997275 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:20 crc kubenswrapper[4839]: I0321 04:48:20.048248 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwl97" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" probeResult="failure" output=< Mar 21 04:48:20 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:48:20 crc kubenswrapper[4839]: > Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.934710 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.995070 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.995338 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" containerID="cri-o://49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" gracePeriod=10 Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.163061 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.166387 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.188684 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.202901 4839 generic.go:334] "Generic (PLEG): container finished" podID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerID="49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" exitCode=0 Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.202950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28"} Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300597 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " 
pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300990 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301156 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301184 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403006 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") 
" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403075 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 
21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403371 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404809 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404819 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404914 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.405198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.405562 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.416081 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.430453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.541555 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.661599 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810210 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810304 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810434 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810460 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810583 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.820130 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48" (OuterVolumeSpecName: "kube-api-access-pfp48") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "kube-api-access-pfp48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.863079 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.863813 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.878094 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.878528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config" (OuterVolumeSpecName: "config") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.885528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913284 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913318 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913329 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913338 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc 
kubenswrapper[4839]: I0321 04:48:23.913349 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913358 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.026023 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.216732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"223ef65b13d73e2f7904cd94127f13fb98845ae46f6fe3c063ed71c9184d7fbc"} Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.217065 4839 scope.go:117] "RemoveContainer" containerID="49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.216764 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.218535 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerStarted","Data":"9630e345dc685e54ca50f351a3dee5431dd5ed44eb1c8385623484c27885d436"} Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.263336 4839 scope.go:117] "RemoveContainer" containerID="9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.281938 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.289634 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.463980 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" path="/var/lib/kubelet/pods/f0b06ab0-2209-4fb3-a837-ec755b412525/volumes" Mar 21 04:48:25 crc kubenswrapper[4839]: I0321 04:48:25.232803 4839 generic.go:334] "Generic (PLEG): container finished" podID="a31699b4-0a8f-42c8-b7f4-319ef1d5423a" containerID="2f10ef34f922523dcddaa070f87245aa996ca456d8cc3c469dc033bc9dc5d8e7" exitCode=0 Mar 21 04:48:25 crc kubenswrapper[4839]: I0321 04:48:25.233362 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerDied","Data":"2f10ef34f922523dcddaa070f87245aa996ca456d8cc3c469dc033bc9dc5d8e7"} Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.244115 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" 
event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerStarted","Data":"3eb74633227d6594ed27aa32b037a49f3d26f7a5e99f04f430b702b54d398bab"} Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.244410 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.268098 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" podStartSLOduration=3.26807993 podStartE2EDuration="3.26807993s" podCreationTimestamp="2026-03-21 04:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:26.26272264 +0000 UTC m=+1510.590509316" watchObservedRunningTime="2026-03-21 04:48:26.26807993 +0000 UTC m=+1510.595866606" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.061515 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.113332 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.299821 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.288541 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwl97" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" containerID="cri-o://8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" gracePeriod=2 Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.746029 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.849715 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.849798 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.850036 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.851672 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities" (OuterVolumeSpecName: "utilities") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.857944 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh" (OuterVolumeSpecName: "kube-api-access-nl8vh") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "kube-api-access-nl8vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.953309 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.953362 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.981497 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.981635 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.984825 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.054562 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298840 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" exitCode=0 Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298889 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298910 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298925 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"4d51b3ae90fbcc99929a148d8825194077e5974286f615cf5f4f49328360ebc9"} Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298972 4839 scope.go:117] "RemoveContainer" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.318523 4839 scope.go:117] "RemoveContainer" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.331696 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.340354 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.358987 4839 scope.go:117] "RemoveContainer" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.387804 4839 scope.go:117] "RemoveContainer" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 04:48:31.388328 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": container with ID starting with 8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d not found: ID does not exist" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.388392 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} err="failed to get container status \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": rpc error: code = NotFound desc = could not find container \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": container with ID starting with 8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d not found: ID does not exist" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.388436 4839 scope.go:117] "RemoveContainer" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 04:48:31.389142 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": container with ID starting with 79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb not found: ID does not exist" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389185 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"} err="failed to get container status \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": rpc error: code = NotFound desc = could not find container \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": container with ID starting with 79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb not found: ID does not exist" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389214 4839 scope.go:117] "RemoveContainer" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 
04:48:31.389726 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": container with ID starting with acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b not found: ID does not exist" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389754 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b"} err="failed to get container status \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": rpc error: code = NotFound desc = could not find container \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": container with ID starting with acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b not found: ID does not exist" Mar 21 04:48:32 crc kubenswrapper[4839]: I0321 04:48:32.463807 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" path="/var/lib/kubelet/pods/346daec7-d0f8-4237-a189-2b84c2a65207/volumes" Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.543772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.598638 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.598904 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" containerID="cri-o://7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" gracePeriod=10 Mar 21 04:48:34 crc 
kubenswrapper[4839]: I0321 04:48:34.063150 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208601 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 
04:48:34.208798 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208903 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.255087 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn" (OuterVolumeSpecName: "kube-api-access-9tpcn") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "kube-api-access-9tpcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.289389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.292260 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.301128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.307464 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311410 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311443 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311456 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311466 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") 
on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311474 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.317169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config" (OuterVolumeSpecName: "config") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.318806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337259 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" exitCode=0 Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337358 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337386 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3"} Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337742 4839 scope.go:117] "RemoveContainer" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.382046 4839 scope.go:117] "RemoveContainer" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.390235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.400194 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.414114 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.414197 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.419433 4839 scope.go:117] "RemoveContainer" 
containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: E0321 04:48:34.420029 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": container with ID starting with 7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545 not found: ID does not exist" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.420094 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} err="failed to get container status \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": rpc error: code = NotFound desc = could not find container \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": container with ID starting with 7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545 not found: ID does not exist" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.420115 4839 scope.go:117] "RemoveContainer" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: E0321 04:48:34.422031 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": container with ID starting with 03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc not found: ID does not exist" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.422085 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc"} err="failed to get container status \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": rpc error: code = NotFound desc = could not find container \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": container with ID starting with 03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc not found: ID does not exist" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.466242 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" path="/var/lib/kubelet/pods/fa3a88fe-e92a-48a2-9d53-97c2e2c16407/volumes" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.302524 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305099 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305132 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305158 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305166 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305188 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-utilities" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305196 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-utilities" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305222 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-content" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305230 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-content" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305272 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305280 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305288 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305295 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305320 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305334 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307017 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307072 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307115 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.309183 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.321846 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.321964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.322169 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.322397 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.350418 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.469636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.470527 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.470984 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.471060 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.572930 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.572996 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.573106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.573158 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.584435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.584480 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.586400 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.594653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.654531 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:43 crc kubenswrapper[4839]: I0321 04:48:43.254452 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:43 crc kubenswrapper[4839]: W0321 04:48:43.256274 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb0bb61_c53a_4171_bca5_4a3141d6904a.slice/crio-09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f WatchSource:0}: Error finding container 09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f: Status 404 returned error can't find the container with id 09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f Mar 21 04:48:43 crc kubenswrapper[4839]: I0321 04:48:43.427711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerStarted","Data":"09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f"} Mar 21 04:48:46 crc kubenswrapper[4839]: I0321 04:48:46.466124 4839 generic.go:334] "Generic (PLEG): container finished" podID="bfff67da-8ea4-4798-9b8d-58a3abac4347" containerID="644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332" exitCode=0 Mar 21 04:48:46 crc kubenswrapper[4839]: I0321 04:48:46.472133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerDied","Data":"644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.485802 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa82c4a0-2b0e-4e22-9e91-7fc899122414" containerID="7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7" exitCode=0 Mar 21 04:48:47 crc 
kubenswrapper[4839]: I0321 04:48:47.485886 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerDied","Data":"7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.488960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"8ce022af499ca6abdf4830b373ebbc4fd26f136d225297b83b854a27500743cb"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.489939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.542259 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.542241233 podStartE2EDuration="36.542241233s" podCreationTimestamp="2026-03-21 04:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:47.540871725 +0000 UTC m=+1531.868658401" watchObservedRunningTime="2026-03-21 04:48:47.542241233 +0000 UTC m=+1531.870027909" Mar 21 04:48:48 crc kubenswrapper[4839]: I0321 04:48:48.132494 4839 scope.go:117] "RemoveContainer" containerID="13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.556472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerStarted","Data":"b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46"} Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.560781 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"3f89c476ec85132bd8e8dfe48defcb72b9dbc1aaae748a5f27ad4343af433ead"} Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.561180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.583887 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" podStartSLOduration=1.764881259 podStartE2EDuration="11.583860913s" podCreationTimestamp="2026-03-21 04:48:42 +0000 UTC" firstStartedPulling="2026-03-21 04:48:43.259912685 +0000 UTC m=+1527.587699361" lastFinishedPulling="2026-03-21 04:48:53.078892339 +0000 UTC m=+1537.406679015" observedRunningTime="2026-03-21 04:48:53.571579418 +0000 UTC m=+1537.899366104" watchObservedRunningTime="2026-03-21 04:48:53.583860913 +0000 UTC m=+1537.911647589" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.603684 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.603663898 podStartE2EDuration="41.603663898s" podCreationTimestamp="2026-03-21 04:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:53.597718722 +0000 UTC m=+1537.925505418" watchObservedRunningTime="2026-03-21 04:48:53.603663898 +0000 UTC m=+1537.931450574" Mar 21 04:49:00 crc kubenswrapper[4839]: I0321 04:49:00.980816 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:49:00 crc kubenswrapper[4839]: I0321 04:49:00.981371 4839 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4839]: I0321 04:49:01.468329 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 04:49:06 crc kubenswrapper[4839]: I0321 04:49:06.688719 4839 generic.go:334] "Generic (PLEG): container finished" podID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerID="b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46" exitCode=0 Mar 21 04:49:06 crc kubenswrapper[4839]: I0321 04:49:06.688804 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerDied","Data":"b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46"} Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.379901 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.474277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.474366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.476113 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.476250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.487418 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.487731 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j" (OuterVolumeSpecName: "kube-api-access-qmc9j") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "kube-api-access-qmc9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.504846 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory" (OuterVolumeSpecName: "inventory") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.531405 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.578972 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579028 4839 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579042 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579054 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.709538 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerDied","Data":"09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f"} Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.710042 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.709653 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.782607 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:08 crc kubenswrapper[4839]: E0321 04:49:08.782994 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783011 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783208 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783799 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788006 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788051 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788303 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788438 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.804419 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884053 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884480 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986001 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986073 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.990663 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.990677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.003489 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.120715 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.634539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:09 crc kubenswrapper[4839]: W0321 04:49:09.635947 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6dd2bff_543f_4ebb_b908_3e528f322548.slice/crio-5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd WatchSource:0}: Error finding container 5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd: Status 404 returned error can't find the container with id 5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.718370 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerStarted","Data":"5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd"} Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.623775 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.759050 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerStarted","Data":"dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104"} Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.783356 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" podStartSLOduration=2.746321885 podStartE2EDuration="4.783334881s" podCreationTimestamp="2026-03-21 04:49:08 +0000 UTC" 
firstStartedPulling="2026-03-21 04:49:09.638802129 +0000 UTC m=+1553.966588805" lastFinishedPulling="2026-03-21 04:49:11.675815125 +0000 UTC m=+1556.003601801" observedRunningTime="2026-03-21 04:49:12.774441232 +0000 UTC m=+1557.102227918" watchObservedRunningTime="2026-03-21 04:49:12.783334881 +0000 UTC m=+1557.111121567" Mar 21 04:49:14 crc kubenswrapper[4839]: I0321 04:49:14.780648 4839 generic.go:334] "Generic (PLEG): container finished" podID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerID="dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104" exitCode=0 Mar 21 04:49:14 crc kubenswrapper[4839]: I0321 04:49:14.780724 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerDied","Data":"dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104"} Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.243098 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325100 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325196 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325260 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.331967 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86" (OuterVolumeSpecName: "kube-api-access-r7w86") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "kube-api-access-r7w86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.357394 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.373997 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory" (OuterVolumeSpecName: "inventory") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428231 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428287 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428302 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804441 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" 
event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerDied","Data":"5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd"} Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804811 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804635 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.869964 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"] Mar 21 04:49:16 crc kubenswrapper[4839]: E0321 04:49:16.870413 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.870441 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.870905 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.871505 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879157 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879297 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879346 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879493 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.883799 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"] Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938405 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 
04:49:16.938554 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938629 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.039956 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040032 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040065 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045182 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045806 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.056963 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.191780 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.792679 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"]
Mar 21 04:49:17 crc kubenswrapper[4839]: W0321 04:49:17.793479 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d76458_d587_4960_9bcc_7e3d3122b44d.slice/crio-822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877 WatchSource:0}: Error finding container 822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877: Status 404 returned error can't find the container with id 822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877
Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.816127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerStarted","Data":"822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877"}
Mar 21 04:49:18 crc kubenswrapper[4839]: I0321 04:49:18.825524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerStarted","Data":"70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986"}
Mar 21 04:49:18 crc kubenswrapper[4839]: I0321 04:49:18.852865 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" podStartSLOduration=2.36990932 podStartE2EDuration="2.852841235s" podCreationTimestamp="2026-03-21 04:49:16 +0000 UTC" firstStartedPulling="2026-03-21 04:49:17.7984729 +0000 UTC m=+1562.126259576" lastFinishedPulling="2026-03-21 04:49:18.281404815 +0000 UTC m=+1562.609191491" observedRunningTime="2026-03-21 04:49:18.8437631 +0000 UTC m=+1563.171549776" watchObservedRunningTime="2026-03-21 04:49:18.852841235 +0000 UTC m=+1563.180627911"
Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.980554 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.981470 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.981552 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7"
Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.982678 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.982744 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" gracePeriod=600
Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.940647 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" exitCode=0
Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.940723 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"}
Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.941000 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"}
Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.941018 4839 scope.go:117] "RemoveContainer" containerID="48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"
Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 04:49:53.176967 4839 scope.go:117] "RemoveContainer" containerID="d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7"
Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 04:49:53.208369 4839 scope.go:117] "RemoveContainer" containerID="4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762"
Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 04:49:53.252064 4839 scope.go:117] "RemoveContainer" containerID="6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.138493 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"]
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.140337 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143321 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143331 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143325 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.151785 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"]
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.307869 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.410510 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.429147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.459603 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.885241 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"]
Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.895852 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 04:50:01 crc kubenswrapper[4839]: I0321 04:50:01.250771 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerStarted","Data":"f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca"}
Mar 21 04:50:03 crc kubenswrapper[4839]: I0321 04:50:03.271855 4839 generic.go:334] "Generic (PLEG): container finished" podID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerID="3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f" exitCode=0
Mar 21 04:50:03 crc kubenswrapper[4839]: I0321 04:50:03.272005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerDied","Data":"3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f"}
Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.629518 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.794418 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"cf9d6591-e9e7-485d-96f3-8f32958ac530\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") "
Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.803368 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f" (OuterVolumeSpecName: "kube-api-access-qp27f") pod "cf9d6591-e9e7-485d-96f3-8f32958ac530" (UID: "cf9d6591-e9e7-485d-96f3-8f32958ac530"). InnerVolumeSpecName "kube-api-access-qp27f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.897765 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") on node \"crc\" DevicePath \"\""
Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295310 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerDied","Data":"f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca"}
Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295358 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca"
Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295393 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf"
Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.721195 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"]
Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.730056 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"]
Mar 21 04:50:06 crc kubenswrapper[4839]: I0321 04:50:06.464444 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" path="/var/lib/kubelet/pods/117f0438-5ab3-4616-b574-c5bbc43e8ac9/volumes"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.047507 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:31 crc kubenswrapper[4839]: E0321 04:50:31.048503 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.048518 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.048699 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.050116 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.058802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197736 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197772 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.298933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.298986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299086 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299635 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299666 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.320352 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.383063 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.009348 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.549767 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1" exitCode=0
Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.550073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1"}
Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.550104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerStarted","Data":"3213ef092a94f429bd88322b83c5988943b92e89dae4bbca5523e68fbd740666"}
Mar 21 04:50:33 crc kubenswrapper[4839]: I0321 04:50:33.562498 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504" exitCode=0
Mar 21 04:50:33 crc kubenswrapper[4839]: I0321 04:50:33.562620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504"}
Mar 21 04:50:34 crc kubenswrapper[4839]: I0321 04:50:34.576010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerStarted","Data":"7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028"}
Mar 21 04:50:34 crc kubenswrapper[4839]: I0321 04:50:34.602134 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wpn7f" podStartSLOduration=2.175491382 podStartE2EDuration="3.602112258s" podCreationTimestamp="2026-03-21 04:50:31 +0000 UTC" firstStartedPulling="2026-03-21 04:50:32.551748969 +0000 UTC m=+1636.879535645" lastFinishedPulling="2026-03-21 04:50:33.978369845 +0000 UTC m=+1638.306156521" observedRunningTime="2026-03-21 04:50:34.600226655 +0000 UTC m=+1638.928013331" watchObservedRunningTime="2026-03-21 04:50:34.602112258 +0000 UTC m=+1638.929898924"
Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.384256 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.385014 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.441378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.701831 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.752689 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:43 crc kubenswrapper[4839]: I0321 04:50:43.676379 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wpn7f" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server" containerID="cri-o://7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028" gracePeriod=2
Mar 21 04:50:44 crc kubenswrapper[4839]: I0321 04:50:44.729123 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028" exitCode=0
Mar 21 04:50:44 crc kubenswrapper[4839]: I0321 04:50:44.729209 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028"}
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.017901 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168093 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") "
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168453 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") "
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") "
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.169556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities" (OuterVolumeSpecName: "utilities") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.169722 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.177121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp" (OuterVolumeSpecName: "kube-api-access-pz2cp") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "kube-api-access-pz2cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.241174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.272294 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.272347 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") on node \"crc\" DevicePath \"\""
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.745881 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"3213ef092a94f429bd88322b83c5988943b92e89dae4bbca5523e68fbd740666"}
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.745944 4839 scope.go:117] "RemoveContainer" containerID="7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028"
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.746021 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f"
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.768133 4839 scope.go:117] "RemoveContainer" containerID="2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504"
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.800591 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.809500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"]
Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.822182 4839 scope.go:117] "RemoveContainer" containerID="d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1"
Mar 21 04:50:46 crc kubenswrapper[4839]: I0321 04:50:46.464133 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" path="/var/lib/kubelet/pods/919b184f-e6e2-4633-aad8-37bbe3fa579f/volumes"
Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.389953 4839 scope.go:117] "RemoveContainer" containerID="d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76"
Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.416083 4839 scope.go:117] "RemoveContainer" containerID="d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0"
Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.464725 4839 scope.go:117] "RemoveContainer" containerID="91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1"
Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.499782 4839 scope.go:117] "RemoveContainer" containerID="2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.304253 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"]
Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305250 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-content"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305266 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-content"
Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305280 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-utilities"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305288 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-utilities"
Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305303 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305311 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305549 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.308405 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.321121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"]
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412154 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514176 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.515162 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.515224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.539808 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.631029 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.002157 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"]
Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.886819 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" exitCode=0
Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.887685 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d"}
Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.888336 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerStarted","Data":"76f9974553375f6f0030f46f723765a84752d0e71db2420884c6cbca60a091a3"}
Mar 21 04:51:02 crc kubenswrapper[4839]: I0321 04:51:02.905333 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" exitCode=0
Mar 21 04:51:02 crc kubenswrapper[4839]: I0321 04:51:02.905443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef"}
Mar 21 04:51:04 crc kubenswrapper[4839]: I0321 04:51:04.926037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerStarted","Data":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"}
Mar 21 04:51:04 crc kubenswrapper[4839]: I0321 04:51:04.943599 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wnqhz" podStartSLOduration=3.081683889 podStartE2EDuration="5.943581554s" podCreationTimestamp="2026-03-21 04:50:59 +0000 UTC" firstStartedPulling="2026-03-21 04:51:00.889637399 +0000 UTC m=+1665.217424075" lastFinishedPulling="2026-03-21 04:51:03.751535064 +0000 UTC m=+1668.079321740" observedRunningTime="2026-03-21 04:51:04.942896025 +0000 UTC m=+1669.270682731" watchObservedRunningTime="2026-03-21 04:51:04.943581554 +0000 UTC m=+1669.271368230"
Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.631463 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.632103 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.691471 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:10 crc kubenswrapper[4839]: I0321 04:51:10.004878 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:10 crc kubenswrapper[4839]: I0321 04:51:10.048673 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"]
Mar 21 04:51:11 crc kubenswrapper[4839]: I0321 04:51:11.979893 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wnqhz" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" containerID="cri-o://9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" gracePeriod=2
Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.492046 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz"
Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570365 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") "
Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570557 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") "
Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570703 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") "
Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.572304 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities" (OuterVolumeSpecName: "utilities") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.578194 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7" (OuterVolumeSpecName: "kube-api-access-nxff7") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "kube-api-access-nxff7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.610704 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673366 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673409 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673424 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990329 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" 
containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" exitCode=0 Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990415 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990414 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"} Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990800 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"76f9974553375f6f0030f46f723765a84752d0e71db2420884c6cbca60a091a3"} Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990821 4839 scope.go:117] "RemoveContainer" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.011705 4839 scope.go:117] "RemoveContainer" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.023346 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.032974 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.044680 4839 scope.go:117] "RemoveContainer" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.081652 4839 scope.go:117] "RemoveContainer" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 
04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.082067 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": container with ID starting with 9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f not found: ID does not exist" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082109 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"} err="failed to get container status \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": rpc error: code = NotFound desc = could not find container \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": container with ID starting with 9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f not found: ID does not exist" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082139 4839 scope.go:117] "RemoveContainer" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.082542 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": container with ID starting with e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef not found: ID does not exist" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082620 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef"} err="failed to get container status 
\"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": rpc error: code = NotFound desc = could not find container \"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": container with ID starting with e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef not found: ID does not exist" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082648 4839 scope.go:117] "RemoveContainer" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.083018 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": container with ID starting with fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d not found: ID does not exist" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.083046 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d"} err="failed to get container status \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": rpc error: code = NotFound desc = could not find container \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": container with ID starting with fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d not found: ID does not exist" Mar 21 04:51:14 crc kubenswrapper[4839]: I0321 04:51:14.465189 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" path="/var/lib/kubelet/pods/49f12ee9-da5c-44bf-aa45-02640350f0ea/volumes" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.988681 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:39 
crc kubenswrapper[4839]: E0321 04:51:39.989884 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989901 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: E0321 04:51:39.989937 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-utilities" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989946 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-utilities" Mar 21 04:51:39 crc kubenswrapper[4839]: E0321 04:51:39.989963 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-content" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989971 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-content" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.990197 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.991832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.006892 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145322 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247467 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.248003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.248048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.275800 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.312664 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.806336 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233673 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" exitCode=0 Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d"} Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233751 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerStarted","Data":"ad270626a3212620dcc08dc79d71191ac5bcf08c5e916533298a0ab1ed1c26c5"} Mar 21 04:51:43 crc kubenswrapper[4839]: I0321 04:51:43.256085 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" exitCode=0 Mar 21 04:51:43 crc kubenswrapper[4839]: I0321 04:51:43.256147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6"} Mar 21 04:51:43 crc kubenswrapper[4839]: E0321 04:51:43.344474 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913aacec_84de_44a6_98fb_382c04095d62.slice/crio-dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:51:45 crc kubenswrapper[4839]: I0321 04:51:45.274185 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerStarted","Data":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} Mar 21 04:51:45 crc kubenswrapper[4839]: I0321 04:51:45.295444 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4lbn" podStartSLOduration=3.251086168 podStartE2EDuration="6.295424905s" podCreationTimestamp="2026-03-21 04:51:39 +0000 UTC" firstStartedPulling="2026-03-21 04:51:41.235319107 +0000 UTC m=+1705.563105803" lastFinishedPulling="2026-03-21 04:51:44.279657874 +0000 UTC m=+1708.607444540" observedRunningTime="2026-03-21 04:51:45.29274708 +0000 UTC m=+1709.620533786" watchObservedRunningTime="2026-03-21 04:51:45.295424905 +0000 UTC m=+1709.623211581" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.312950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.313504 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.358130 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:51 crc kubenswrapper[4839]: I0321 04:51:51.368390 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:51 crc kubenswrapper[4839]: 
I0321 04:51:51.417127 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.344305 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4lbn" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" containerID="cri-o://476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" gracePeriod=2 Mar 21 04:51:53 crc kubenswrapper[4839]: E0321 04:51:53.574982 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913aacec_84de_44a6_98fb_382c04095d62.slice/crio-conmon-476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.815493 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903112 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903254 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903400 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.905430 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities" (OuterVolumeSpecName: "utilities") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.911601 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl" (OuterVolumeSpecName: "kube-api-access-8zwfl") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "kube-api-access-8zwfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.005303 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.005349 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353651 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" exitCode=0 Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353730 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.354712 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"ad270626a3212620dcc08dc79d71191ac5bcf08c5e916533298a0ab1ed1c26c5"} Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.354735 4839 scope.go:117] "RemoveContainer" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.373020 4839 scope.go:117] "RemoveContainer" 
containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.394388 4839 scope.go:117] "RemoveContainer" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.436646 4839 scope.go:117] "RemoveContainer" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.437220 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": container with ID starting with 476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8 not found: ID does not exist" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437252 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} err="failed to get container status \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": rpc error: code = NotFound desc = could not find container \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": container with ID starting with 476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8 not found: ID does not exist" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437274 4839 scope.go:117] "RemoveContainer" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.437693 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": container with ID starting with 
dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6 not found: ID does not exist" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437742 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6"} err="failed to get container status \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": rpc error: code = NotFound desc = could not find container \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": container with ID starting with dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6 not found: ID does not exist" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437770 4839 scope.go:117] "RemoveContainer" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.438049 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": container with ID starting with a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d not found: ID does not exist" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.438073 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d"} err="failed to get container status \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": rpc error: code = NotFound desc = could not find container \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": container with ID starting with a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d not found: ID does not 
exist" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.944783 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.945453 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:55 crc kubenswrapper[4839]: W0321 04:51:55.945556 4839 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/913aacec-84de-44a6-98fb-382c04095d62/volumes/kubernetes.io~empty-dir/catalog-content Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.945627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.946072 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.199273 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.207052 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.464082 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913aacec-84de-44a6-98fb-382c04095d62" path="/var/lib/kubelet/pods/913aacec-84de-44a6-98fb-382c04095d62/volumes" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.144783 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145888 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145906 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145960 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-utilities" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145970 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-utilities" Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145984 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-content" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145993 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-content" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.146209 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.146895 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150378 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150624 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150763 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.156593 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.325903 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.427780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88r4\" 
(UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.450135 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.475599 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.969424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.979731 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.979785 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:52:01 crc kubenswrapper[4839]: I0321 04:52:01.415721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" 
event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerStarted","Data":"f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094"} Mar 21 04:52:02 crc kubenswrapper[4839]: I0321 04:52:02.426456 4839 generic.go:334] "Generic (PLEG): container finished" podID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerID="5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663" exitCode=0 Mar 21 04:52:02 crc kubenswrapper[4839]: I0321 04:52:02.426617 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerDied","Data":"5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663"} Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.746121 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.894174 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.905044 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4" (OuterVolumeSpecName: "kube-api-access-f88r4") pod "d13124dd-cca5-49f6-9638-2cb42ed2bb34" (UID: "d13124dd-cca5-49f6-9638-2cb42ed2bb34"). InnerVolumeSpecName "kube-api-access-f88r4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.998196 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerDied","Data":"f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094"} Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446071 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446086 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.841687 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"] Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.854206 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"] Mar 21 04:52:06 crc kubenswrapper[4839]: I0321 04:52:06.464488 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" path="/var/lib/kubelet/pods/75c1454e-0aed-48d9-a0f2-f7c2797156ce/volumes" Mar 21 04:52:10 crc kubenswrapper[4839]: I0321 04:52:10.497396 4839 generic.go:334] "Generic (PLEG): container finished" podID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerID="70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986" exitCode=0 Mar 21 04:52:10 crc kubenswrapper[4839]: I0321 04:52:10.497487 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerDied","Data":"70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986"} Mar 21 04:52:11 crc kubenswrapper[4839]: I0321 04:52:11.889061 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050288 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050442 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050508 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 
04:52:12.058705 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g" (OuterVolumeSpecName: "kube-api-access-zjf5g") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "kube-api-access-zjf5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.063716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.081382 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.083774 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory" (OuterVolumeSpecName: "inventory") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153498 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153545 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153560 4839 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153582 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.518963 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerDied","Data":"822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877"} Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.519278 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.519013 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.600367 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:12 crc kubenswrapper[4839]: E0321 04:52:12.600959 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.600985 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: E0321 04:52:12.601024 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601034 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601262 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601300 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.602076 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.612164 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613010 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613127 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613426 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613488 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.771593 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.771745 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 
04:52:12.771849 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873526 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873658 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873740 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.879303 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.881989 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.894923 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.929418 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:13 crc kubenswrapper[4839]: I0321 04:52:13.477724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:13 crc kubenswrapper[4839]: I0321 04:52:13.532147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerStarted","Data":"c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"} Mar 21 04:52:14 crc kubenswrapper[4839]: I0321 04:52:14.541727 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerStarted","Data":"10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c"} Mar 21 04:52:14 crc kubenswrapper[4839]: I0321 04:52:14.570887 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" podStartSLOduration=2.055597203 podStartE2EDuration="2.570865011s" podCreationTimestamp="2026-03-21 04:52:12 +0000 UTC" firstStartedPulling="2026-03-21 04:52:13.498744088 +0000 UTC m=+1737.826530764" lastFinishedPulling="2026-03-21 04:52:14.014011906 +0000 UTC m=+1738.341798572" observedRunningTime="2026-03-21 04:52:14.564084141 +0000 UTC m=+1738.891870837" watchObservedRunningTime="2026-03-21 04:52:14.570865011 +0000 UTC m=+1738.898651687" Mar 21 04:52:30 crc kubenswrapper[4839]: I0321 04:52:30.980730 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:52:30 
crc kubenswrapper[4839]: I0321 04:52:30.981153 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4839]: I0321 04:52:53.674734 4839 scope.go:117] "RemoveContainer" containerID="66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.980441 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.980963 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.981006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.981737 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 
04:53:00.981793 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" gracePeriod=600 Mar 21 04:53:01 crc kubenswrapper[4839]: E0321 04:53:01.107798 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967077 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" exitCode=0 Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967141 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"} Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967441 4839 scope.go:117] "RemoveContainer" containerID="c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.968108 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:53:01 crc kubenswrapper[4839]: E0321 04:53:01.968430 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:16 crc kubenswrapper[4839]: I0321 04:53:16.458738 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:53:16 crc kubenswrapper[4839]: E0321 04:53:16.459501 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.049964 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.062280 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.073997 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.081870 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.089872 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.100483 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 
04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.465682 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" path="/var/lib/kubelet/pods/59ce13a7-d2a6-4c54-908d-39d1511da50b/volumes" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.466821 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" path="/var/lib/kubelet/pods/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0/volumes" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.467893 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" path="/var/lib/kubelet/pods/e779c2ff-ee70-4779-b3fc-3b3bf87aff47/volumes" Mar 21 04:53:21 crc kubenswrapper[4839]: I0321 04:53:21.033206 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:53:21 crc kubenswrapper[4839]: I0321 04:53:21.041306 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:53:22 crc kubenswrapper[4839]: I0321 04:53:22.463808 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" path="/var/lib/kubelet/pods/c9a6840a-2ece-4b8d-be60-caa89912db9f/volumes" Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.045202 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.053973 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.062458 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.070039 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:53:24 
crc kubenswrapper[4839]: I0321 04:53:24.463257 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" path="/var/lib/kubelet/pods/5740bec9-4b0c-4092-8309-14fdb2562c2e/volumes"
Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.464434 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" path="/var/lib/kubelet/pods/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754/volumes"
Mar 21 04:53:31 crc kubenswrapper[4839]: I0321 04:53:31.452968 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:31 crc kubenswrapper[4839]: E0321 04:53:31.453409 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:39 crc kubenswrapper[4839]: I0321 04:53:39.310930 4839 generic.go:334] "Generic (PLEG): container finished" podID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerID="10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c" exitCode=0
Mar 21 04:53:39 crc kubenswrapper[4839]: I0321 04:53:39.311138 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerDied","Data":"10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c"}
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.720039 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739600 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739651 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739689 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.764913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz" (OuterVolumeSpecName: "kube-api-access-vstkz") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "kube-api-access-vstkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.774505 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.797416 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory" (OuterVolumeSpecName: "inventory") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.841965 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.842020 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.842035 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerDied","Data":"c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"}
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356985 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356794 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439136 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:41 crc kubenswrapper[4839]: E0321 04:53:41.439680 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439702 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439972 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.440763 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445414 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445531 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445677 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.446255 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460419 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460616 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460874 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562384 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562539 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.567364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.577143 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.587680 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.772796 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:42 crc kubenswrapper[4839]: I0321 04:53:42.314216 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:42 crc kubenswrapper[4839]: I0321 04:53:42.366231 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerStarted","Data":"ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16"}
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.378799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerStarted","Data":"2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29"}
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.413505 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" podStartSLOduration=1.793650655 podStartE2EDuration="2.413486108s" podCreationTimestamp="2026-03-21 04:53:41 +0000 UTC" firstStartedPulling="2026-03-21 04:53:42.314864541 +0000 UTC m=+1826.642651207" lastFinishedPulling="2026-03-21 04:53:42.934699984 +0000 UTC m=+1827.262486660" observedRunningTime="2026-03-21 04:53:43.398676742 +0000 UTC m=+1827.726463438" watchObservedRunningTime="2026-03-21 04:53:43.413486108 +0000 UTC m=+1827.741272784"
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.453849 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:43 crc kubenswrapper[4839]: E0321 04:53:43.454211 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.057703 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.074384 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.096878 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-79rjr"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.107007 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-79rjr"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.124842 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.136262 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h5448"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.147221 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.158044 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.168928 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h5448"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.180102 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nv7qf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.191500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.203094 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nv7qf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.213154 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xc9zf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.222451 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xc9zf"]
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.464172 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" path="/var/lib/kubelet/pods/34a240db-9587-446e-af12-a44b87b1a3ac/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.464742 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" path="/var/lib/kubelet/pods/8c8ad856-1b19-4b1c-8124-2e316dd567ee/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.465243 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" path="/var/lib/kubelet/pods/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.465843 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" path="/var/lib/kubelet/pods/a7dfdbcf-7830-4f8d-a165-119fe80d999a/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.466860 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" path="/var/lib/kubelet/pods/ae4cf7f5-74ed-45d7-ace7-24ada744db6c/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.467397 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" path="/var/lib/kubelet/pods/b567c69c-d110-4ab2-aaf7-da82f0e72cc3/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.467926 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" path="/var/lib/kubelet/pods/f816daf8-a9c7-4e99-a622-2f9bee7d203a/volumes"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.774499 4839 scope.go:117] "RemoveContainer" containerID="032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.808685 4839 scope.go:117] "RemoveContainer" containerID="048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.841010 4839 scope.go:117] "RemoveContainer" containerID="3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.880224 4839 scope.go:117] "RemoveContainer" containerID="1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.920239 4839 scope.go:117] "RemoveContainer" containerID="18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.960357 4839 scope.go:117] "RemoveContainer" containerID="e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.010486 4839 scope.go:117] "RemoveContainer" containerID="1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.039973 4839 scope.go:117] "RemoveContainer" containerID="bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.058611 4839 scope.go:117] "RemoveContainer" containerID="ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.079870 4839 scope.go:117] "RemoveContainer" containerID="1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.101392 4839 scope.go:117] "RemoveContainer" containerID="435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.125118 4839 scope.go:117] "RemoveContainer" containerID="3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.143350 4839 scope.go:117] "RemoveContainer" containerID="e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.453097 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:54 crc kubenswrapper[4839]: E0321 04:53:54.453400 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.050104 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qgdlf"]
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.059904 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qgdlf"]
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.462473 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" path="/var/lib/kubelet/pods/bc21c34c-13c1-4733-9013-0cfd304b179c/volumes"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.143830 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.145027 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.147109 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.147532 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.148181 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.152543 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.208739 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.312256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.331430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.585902 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:01 crc kubenswrapper[4839]: I0321 04:54:01.006236 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:01 crc kubenswrapper[4839]: W0321 04:54:01.013094 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852785cf_c79d_4c8e_92f0_f15d9836b437.slice/crio-aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d WatchSource:0}: Error finding container aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d: Status 404 returned error can't find the container with id aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d
Mar 21 04:54:01 crc kubenswrapper[4839]: I0321 04:54:01.533762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerStarted","Data":"aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"}
Mar 21 04:54:03 crc kubenswrapper[4839]: I0321 04:54:03.550518 4839 generic.go:334] "Generic (PLEG): container finished" podID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerID="d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c" exitCode=0
Mar 21 04:54:03 crc kubenswrapper[4839]: I0321 04:54:03.550695 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerDied","Data":"d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c"}
Mar 21 04:54:04 crc kubenswrapper[4839]: I0321 04:54:04.900690 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.102420 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"852785cf-c79d-4c8e-92f0-f15d9836b437\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") "
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.109771 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq" (OuterVolumeSpecName: "kube-api-access-7wmrq") pod "852785cf-c79d-4c8e-92f0-f15d9836b437" (UID: "852785cf-c79d-4c8e-92f0-f15d9836b437"). InnerVolumeSpecName "kube-api-access-7wmrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.205073 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569589 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerDied","Data":"aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"}
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569640 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569638 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.961170 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.976507 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:54:06 crc kubenswrapper[4839]: I0321 04:54:06.464035 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:06 crc kubenswrapper[4839]: E0321 04:54:06.464646 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:06 crc kubenswrapper[4839]: I0321 04:54:06.467318 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" path="/var/lib/kubelet/pods/5d56af53-fce2-4320-b4fa-32b5c6798921/volumes"
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.030261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ng2tw"]
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.039756 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ng2tw"]
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.465125 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" path="/var/lib/kubelet/pods/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1/volumes"
Mar 21 04:54:19 crc kubenswrapper[4839]: I0321 04:54:19.452970 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:19 crc kubenswrapper[4839]: E0321 04:54:19.454865 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:33 crc kubenswrapper[4839]: I0321 04:54:33.037506 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nm9t5"]
Mar 21 04:54:33 crc kubenswrapper[4839]: I0321 04:54:33.045676 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nm9t5"]
Mar 21 04:54:34 crc kubenswrapper[4839]: I0321 04:54:34.452970 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:34 crc kubenswrapper[4839]: E0321 04:54:34.453286 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:34 crc kubenswrapper[4839]: I0321 04:54:34.463919 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" path="/var/lib/kubelet/pods/625a99bd-bc01-400e-8e9c-1f5eff390466/volumes"
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.040636 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wdddk"]
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.049900 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wdddk"]
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.464034 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" path="/var/lib/kubelet/pods/e6e87cbd-1f46-4fa0-9529-8250f9fee21c/volumes"
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.028092 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-t8kxj"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.039110 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.050476 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-t8kxj"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.059731 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.460132 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:46 crc kubenswrapper[4839]: E0321 04:54:46.460411 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.462925 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" path="/var/lib/kubelet/pods/6d0e1745-6e0b-475c-a1de-d049018abea6/volumes"
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.463523 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" path="/var/lib/kubelet/pods/7cada35b-7e7f-4d22-895f-588b90e48c70/volumes"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.367037 4839 scope.go:117] "RemoveContainer" containerID="3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.403538 4839 scope.go:117] "RemoveContainer" containerID="fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.452910 4839 scope.go:117] "RemoveContainer" containerID="79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.491838 4839 scope.go:117] "RemoveContainer" containerID="848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.540457 4839 scope.go:117] "RemoveContainer" containerID="143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.586442 4839 scope.go:117] "RemoveContainer" containerID="dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.640939 4839 scope.go:117] "RemoveContainer" containerID="412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"
Mar 21 04:54:56 crc kubenswrapper[4839]: I0321 04:54:56.593195 4839 generic.go:334] "Generic (PLEG): container finished" podID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerID="2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29" exitCode=0
Mar 21 04:54:56 crc kubenswrapper[4839]: I0321 04:54:56.593298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerDied","Data":"2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29"}
Mar 21 04:54:57 crc kubenswrapper[4839]: I0321 04:54:57.453252 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:57 crc kubenswrapper[4839]: E0321 04:54:57.453510 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.003435 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.126870 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.127290 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.127549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.132251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9" (OuterVolumeSpecName: "kube-api-access-9hxd9") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "kube-api-access-9hxd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.156383 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.157341 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory" (OuterVolumeSpecName: "inventory") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230313 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230622 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230716 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611527 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerDied","Data":"ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16"} Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611588 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611607 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.695545 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:58 crc kubenswrapper[4839]: E0321 04:54:58.695963 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.695998 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: E0321 04:54:58.696024 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696031 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696335 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696351 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.697107 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.698559 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699272 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699310 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699397 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.706999 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.842643 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.842851 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.843204 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.944902 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.944997 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.945049 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.948878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.949040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.966675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.018679 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.506275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.619583 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerStarted","Data":"eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7"} Mar 21 04:55:00 crc kubenswrapper[4839]: I0321 04:55:00.631203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerStarted","Data":"79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d"} Mar 21 04:55:00 crc kubenswrapper[4839]: I0321 04:55:00.651471 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" podStartSLOduration=2.246753007 podStartE2EDuration="2.65145288s" podCreationTimestamp="2026-03-21 04:54:58 +0000 UTC" firstStartedPulling="2026-03-21 04:54:59.511773588 +0000 UTC m=+1903.839560254" lastFinishedPulling="2026-03-21 04:54:59.916473451 +0000 UTC m=+1904.244260127" observedRunningTime="2026-03-21 04:55:00.647992052 +0000 UTC m=+1904.975778738" watchObservedRunningTime="2026-03-21 04:55:00.65145288 +0000 UTC m=+1904.979239556" Mar 21 04:55:04 crc kubenswrapper[4839]: I0321 04:55:04.663343 4839 generic.go:334] "Generic (PLEG): container finished" podID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerID="79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d" exitCode=0 Mar 21 04:55:04 crc kubenswrapper[4839]: I0321 04:55:04.663630 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerDied","Data":"79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d"} Mar 21 04:55:05 crc kubenswrapper[4839]: I0321 04:55:05.043914 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:55:05 crc kubenswrapper[4839]: I0321 04:55:05.053034 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.033988 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.177914 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.178098 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.178131 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.183871 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4" (OuterVolumeSpecName: "kube-api-access-7gzp4") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "kube-api-access-7gzp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.205592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.206187 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory" (OuterVolumeSpecName: "inventory") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281493 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281542 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281556 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.465307 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6000d2d4-e84a-443f-9094-ab999541331d" path="/var/lib/kubelet/pods/6000d2d4-e84a-443f-9094-ab999541331d/volumes" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerDied","Data":"eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7"} Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679324 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679826 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817320 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:06 crc kubenswrapper[4839]: E0321 04:55:06.817720 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817739 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817944 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.819333 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823339 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823584 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823801 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.833666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.898370 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.898947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 
04:55:06.899179 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.000450 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.000722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.001033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.005118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.005162 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.020843 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.153117 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.821956 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.834189 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:55:08 crc kubenswrapper[4839]: I0321 04:55:08.695526 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerStarted","Data":"c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e"} Mar 21 04:55:09 crc kubenswrapper[4839]: I0321 04:55:09.704586 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerStarted","Data":"3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a"} Mar 21 04:55:09 crc kubenswrapper[4839]: I0321 04:55:09.727827 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" podStartSLOduration=3.082889663 podStartE2EDuration="3.727785664s" podCreationTimestamp="2026-03-21 04:55:06 +0000 UTC" firstStartedPulling="2026-03-21 04:55:07.833906351 +0000 UTC m=+1912.161693027" lastFinishedPulling="2026-03-21 04:55:08.478802352 +0000 UTC m=+1912.806589028" observedRunningTime="2026-03-21 04:55:09.722083233 +0000 UTC m=+1914.049869909" watchObservedRunningTime="2026-03-21 04:55:09.727785664 +0000 UTC m=+1914.055572340" Mar 21 04:55:11 crc kubenswrapper[4839]: I0321 04:55:11.453703 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:11 crc kubenswrapper[4839]: E0321 
04:55:11.454305 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:24 crc kubenswrapper[4839]: I0321 04:55:24.453226 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:24 crc kubenswrapper[4839]: E0321 04:55:24.453917 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:39 crc kubenswrapper[4839]: I0321 04:55:39.453425 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:39 crc kubenswrapper[4839]: E0321 04:55:39.454313 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:40 crc kubenswrapper[4839]: I0321 04:55:40.949707 4839 generic.go:334] "Generic (PLEG): container finished" podID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" 
containerID="3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a" exitCode=0 Mar 21 04:55:40 crc kubenswrapper[4839]: I0321 04:55:40.949758 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerDied","Data":"3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a"} Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.325639 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478758 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.484028 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht" (OuterVolumeSpecName: "kube-api-access-v8tht") 
pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "kube-api-access-v8tht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.505197 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.527816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory" (OuterVolumeSpecName: "inventory") pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581309 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581343 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581358 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") on node \"crc\" DevicePath \"\""
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerDied","Data":"c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e"}
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968201 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e"
Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968218 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.054499 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"]
Mar 21 04:55:43 crc kubenswrapper[4839]: E0321 04:55:43.054952 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.054972 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.055184 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.055927 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.058466 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.058656 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.059673 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.059760 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.076936 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"]
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.092602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.092650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.092735 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194375 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194541 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194588 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.198354 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.198454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.209840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.376423 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.901162 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"]
Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.977530 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerStarted","Data":"4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194"}
Mar 21 04:55:44 crc kubenswrapper[4839]: I0321 04:55:44.987507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerStarted","Data":"da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be"}
Mar 21 04:55:45 crc kubenswrapper[4839]: I0321 04:55:45.012323 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" podStartSLOduration=1.586092883 podStartE2EDuration="2.012299612s" podCreationTimestamp="2026-03-21 04:55:43 +0000 UTC" firstStartedPulling="2026-03-21 04:55:43.900335291 +0000 UTC m=+1948.228121977" lastFinishedPulling="2026-03-21 04:55:44.32654203 +0000 UTC m=+1948.654328706" observedRunningTime="2026-03-21 04:55:45.003239456 +0000 UTC m=+1949.331026152" watchObservedRunningTime="2026-03-21 04:55:45.012299612 +0000 UTC m=+1949.340086288"
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.047495 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4zz89"]
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.057498 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"]
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.066534 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"]
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.074560 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4zz89"]
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.465069 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76e9253-1495-42d5-910f-cce6f2730243" path="/var/lib/kubelet/pods/b76e9253-1495-42d5-910f-cce6f2730243/volumes"
Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.466420 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" path="/var/lib/kubelet/pods/f481fb0d-ac2f-4989-a547-50f5081e4e78/volumes"
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.034367 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.044700 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.054695 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.064910 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.073730 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.081972 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.089829 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"]
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.097761 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"]
Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.470443 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" path="/var/lib/kubelet/pods/4185a56e-9d10-4aea-ad84-a865dff3e6be/volumes"
Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.471369 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" path="/var/lib/kubelet/pods/46c56098-2959-4bd0-b762-36a4ee1bb2e6/volumes"
Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.471999 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" path="/var/lib/kubelet/pods/60534a44-1538-4bdb-81d1-043c9ae84cee/volumes"
Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.472526 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" path="/var/lib/kubelet/pods/9220ed3c-2e97-4efc-a4cc-28bb29774ad8/volumes"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.453982 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:55:54 crc kubenswrapper[4839]: E0321 04:55:54.454710 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.788325 4839 scope.go:117] "RemoveContainer" containerID="89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.861532 4839 scope.go:117] "RemoveContainer" containerID="ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.898883 4839 scope.go:117] "RemoveContainer" containerID="2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.925344 4839 scope.go:117] "RemoveContainer" containerID="d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c"
Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.974051 4839 scope.go:117] "RemoveContainer" containerID="296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692"
Mar 21 04:55:55 crc kubenswrapper[4839]: I0321 04:55:55.018631 4839 scope.go:117] "RemoveContainer" containerID="9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947"
Mar 21 04:55:55 crc kubenswrapper[4839]: I0321 04:55:55.052462 4839 scope.go:117] "RemoveContainer" containerID="c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.140392 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"]
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.142291 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.144354 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.144725 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.145006 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.150818 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"]
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.243089 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.344773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.363335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.464558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.891871 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"]
Mar 21 04:56:01 crc kubenswrapper[4839]: I0321 04:56:01.117949 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerStarted","Data":"02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930"}
Mar 21 04:56:03 crc kubenswrapper[4839]: I0321 04:56:03.134934 4839 generic.go:334] "Generic (PLEG): container finished" podID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerID="4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886" exitCode=0
Mar 21 04:56:03 crc kubenswrapper[4839]: I0321 04:56:03.135110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerDied","Data":"4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886"}
Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.463822 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.628464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") "
Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.636578 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj" (OuterVolumeSpecName: "kube-api-access-qjmcj") pod "f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" (UID: "f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8"). InnerVolumeSpecName "kube-api-access-qjmcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.731219 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") on node \"crc\" DevicePath \"\""
Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerDied","Data":"02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930"}
Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151402 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld"
Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151416 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930"
Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.527926 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"]
Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.535394 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"]
Mar 21 04:56:06 crc kubenswrapper[4839]: I0321 04:56:06.457895 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:56:06 crc kubenswrapper[4839]: E0321 04:56:06.458522 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:56:06 crc kubenswrapper[4839]: I0321 04:56:06.462496 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" path="/var/lib/kubelet/pods/cf9d6591-e9e7-485d-96f3-8f32958ac530/volumes"
Mar 21 04:56:19 crc kubenswrapper[4839]: I0321 04:56:19.034715 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"]
Mar 21 04:56:19 crc kubenswrapper[4839]: I0321 04:56:19.043519 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"]
Mar 21 04:56:20 crc kubenswrapper[4839]: I0321 04:56:20.453445 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:56:20 crc kubenswrapper[4839]: E0321 04:56:20.454158 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:56:20 crc kubenswrapper[4839]: I0321 04:56:20.466595 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" path="/var/lib/kubelet/pods/bbaf057c-375e-4da6-a7cd-8c879a51ff50/volumes"
Mar 21 04:56:29 crc kubenswrapper[4839]: I0321 04:56:29.359666 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerID="da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be" exitCode=0
Mar 21 04:56:29 crc kubenswrapper[4839]: I0321 04:56:29.359872 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerDied","Data":"da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be"}
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.777069 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828252 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") "
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828447 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") "
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828483 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") "
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.847785 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6" (OuterVolumeSpecName: "kube-api-access-blxs6") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "kube-api-access-blxs6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.861399 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.866697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory" (OuterVolumeSpecName: "inventory") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931235 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931277 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931290 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") on node \"crc\" DevicePath \"\""
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerDied","Data":"4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194"}
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377666 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"]
Mar 21 04:56:31 crc kubenswrapper[4839]: E0321 04:56:31.469730 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469758 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:56:31 crc kubenswrapper[4839]: E0321 04:56:31.469808 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469817 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.470050 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.470070 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.475179 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477666 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477681 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477964 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.480341 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"]
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.481449 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.542937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.543087 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.543124 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645252 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645308 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645388 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.649434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.652017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.664406 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.802801 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw"
Mar 21 04:56:32 crc kubenswrapper[4839]: I0321 04:56:32.332482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"]
Mar 21 04:56:32 crc kubenswrapper[4839]: I0321 04:56:32.386784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerStarted","Data":"45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296"}
Mar 21 04:56:33 crc kubenswrapper[4839]: I0321 04:56:33.400300 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerStarted","Data":"6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c"}
Mar 21 04:56:33 crc kubenswrapper[4839]: I0321 04:56:33.416772 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" podStartSLOduration=1.804084568 podStartE2EDuration="2.416748449s" podCreationTimestamp="2026-03-21 04:56:31 +0000 UTC" firstStartedPulling="2026-03-21 04:56:32.340708122 +0000 UTC m=+1996.668494798" lastFinishedPulling="2026-03-21 04:56:32.953371983 +0000 UTC m=+1997.281158679" observedRunningTime="2026-03-21 04:56:33.413139868 +0000 UTC m=+1997.740926544" watchObservedRunningTime="2026-03-21 04:56:33.416748449 +0000 UTC m=+1997.744535125"
Mar 21 04:56:35 crc kubenswrapper[4839]: I0321 04:56:35.453386 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:56:35 crc kubenswrapper[4839]: E0321 04:56:35.454086 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.040150 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"]
Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.051972 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"]
Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.465007 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" path="/var/lib/kubelet/pods/37c6fbf7-427d-45a8-b190-439265c8d6d0/volumes"
Mar 21 04:56:39 crc kubenswrapper[4839]: I0321 04:56:39.455254 4839 generic.go:334] "Generic (PLEG): container finished" podID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerID="6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c" exitCode=0
Mar 21 04:56:39 crc kubenswrapper[4839]: I0321 04:56:39.455334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerDied","Data":"6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c"}
Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.876782 4839 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.945022 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.945291 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.946459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.951588 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r" (OuterVolumeSpecName: "kube-api-access-8gf4r") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "kube-api-access-8gf4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.972950 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.973172 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.048986 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.049482 4839 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.049554 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" 
event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerDied","Data":"45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296"} Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473244 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473261 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.550679 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:41 crc kubenswrapper[4839]: E0321 04:56:41.551091 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551114 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551289 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551933 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.558829 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559132 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559328 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559441 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.570482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659509 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659585 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761085 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761192 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.765332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.766757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.783894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.871365 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.038928 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.048818 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.426470 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.468120 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" path="/var/lib/kubelet/pods/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d/volumes" Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.482146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerStarted","Data":"8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a"} Mar 21 04:56:43 crc kubenswrapper[4839]: I0321 04:56:43.491969 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerStarted","Data":"57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e"} Mar 21 04:56:43 crc kubenswrapper[4839]: I0321 04:56:43.516235 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" podStartSLOduration=2.118632058 podStartE2EDuration="2.516220181s" podCreationTimestamp="2026-03-21 04:56:41 +0000 UTC" firstStartedPulling="2026-03-21 04:56:42.446374418 +0000 UTC m=+2006.774161094" lastFinishedPulling="2026-03-21 04:56:42.843962551 +0000 UTC 
m=+2007.171749217" observedRunningTime="2026-03-21 04:56:43.511945311 +0000 UTC m=+2007.839731997" watchObservedRunningTime="2026-03-21 04:56:43.516220181 +0000 UTC m=+2007.844006857" Mar 21 04:56:47 crc kubenswrapper[4839]: I0321 04:56:47.453086 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:47 crc kubenswrapper[4839]: E0321 04:56:47.453853 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:56:50 crc kubenswrapper[4839]: E0321 04:56:50.340513 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26adbd7b_7994_4bea_9f94_338881339833.slice/crio-57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:56:50 crc kubenswrapper[4839]: I0321 04:56:50.560072 4839 generic.go:334] "Generic (PLEG): container finished" podID="26adbd7b-7994-4bea-9f94-338881339833" containerID="57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e" exitCode=0 Mar 21 04:56:50 crc kubenswrapper[4839]: I0321 04:56:50.560123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerDied","Data":"57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e"} Mar 21 04:56:51 crc kubenswrapper[4839]: I0321 04:56:51.996408 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054531 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054666 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054709 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.059992 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z" (OuterVolumeSpecName: "kube-api-access-c444z") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). InnerVolumeSpecName "kube-api-access-c444z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.085977 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory" (OuterVolumeSpecName: "inventory") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.086076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157772 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157824 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157837 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.579076 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerDied","Data":"8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a"} Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.579114 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 
04:56:52.579135 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.639795 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:52 crc kubenswrapper[4839]: E0321 04:56:52.640180 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.640200 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.640415 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.641179 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.644433 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.644434 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.645294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.645770 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.655722 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666423 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666484 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666551 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768209 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768345 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768386 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.771737 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.772234 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.792867 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.957073 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:53 crc kubenswrapper[4839]: I0321 04:56:53.453335 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:53 crc kubenswrapper[4839]: W0321 04:56:53.456482 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66c3e343_3306_455d_89d7_db17c1bd53ed.slice/crio-b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05 WatchSource:0}: Error finding container b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05: Status 404 returned error can't find the container with id b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05 Mar 21 04:56:53 crc kubenswrapper[4839]: I0321 04:56:53.587784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerStarted","Data":"b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05"} Mar 21 04:56:54 crc kubenswrapper[4839]: I0321 04:56:54.596591 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerStarted","Data":"9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc"} Mar 21 04:56:54 crc kubenswrapper[4839]: I0321 04:56:54.618294 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" podStartSLOduration=2.162687536 podStartE2EDuration="2.618274222s" podCreationTimestamp="2026-03-21 04:56:52 +0000 UTC" firstStartedPulling="2026-03-21 04:56:53.458835825 +0000 UTC m=+2017.786622501" lastFinishedPulling="2026-03-21 04:56:53.914422511 +0000 UTC m=+2018.242209187" 
observedRunningTime="2026-03-21 04:56:54.610990067 +0000 UTC m=+2018.938776733" watchObservedRunningTime="2026-03-21 04:56:54.618274222 +0000 UTC m=+2018.946060898" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.176434 4839 scope.go:117] "RemoveContainer" containerID="3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.220070 4839 scope.go:117] "RemoveContainer" containerID="6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.263033 4839 scope.go:117] "RemoveContainer" containerID="118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.310865 4839 scope.go:117] "RemoveContainer" containerID="a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc" Mar 21 04:56:58 crc kubenswrapper[4839]: I0321 04:56:58.453867 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:58 crc kubenswrapper[4839]: E0321 04:56:58.454647 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:03 crc kubenswrapper[4839]: I0321 04:57:03.663338 4839 generic.go:334] "Generic (PLEG): container finished" podID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerID="9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc" exitCode=0 Mar 21 04:57:03 crc kubenswrapper[4839]: I0321 04:57:03.663426 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" 
event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerDied","Data":"9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc"} Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.083999 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190518 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.198389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl" (OuterVolumeSpecName: "kube-api-access-tgthl") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "kube-api-access-tgthl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.216137 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory" (OuterVolumeSpecName: "inventory") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.218089 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293170 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293200 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293209 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" 
event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerDied","Data":"b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05"} Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683518 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683536 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.763584 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:05 crc kubenswrapper[4839]: E0321 04:57:05.764016 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.764038 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.764307 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.765083 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770748 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770802 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770749 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770908 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.771038 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.771146 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.772646 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.772882 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.778697 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.902901 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.902981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903034 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903244 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903388 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903491 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903551 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903706 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.904040 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006152 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006190 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006212 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006276 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006308 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006346 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006429 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006475 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006511 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006596 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.010666 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011122 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011956 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011968 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013179 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: 
I0321 04:57:06.013212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013271 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013350 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013700 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014469 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014852 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.019133 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.024604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.083900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.585605 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.693223 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerStarted","Data":"6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb"} Mar 21 04:57:07 crc kubenswrapper[4839]: I0321 04:57:07.700849 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerStarted","Data":"25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6"} Mar 21 04:57:07 crc kubenswrapper[4839]: I0321 04:57:07.753386 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" podStartSLOduration=2.311778335 podStartE2EDuration="2.753359517s" podCreationTimestamp="2026-03-21 04:57:05 +0000 UTC" firstStartedPulling="2026-03-21 04:57:06.585627395 +0000 UTC m=+2030.913414081" lastFinishedPulling="2026-03-21 04:57:07.027208587 +0000 UTC m=+2031.354995263" observedRunningTime="2026-03-21 04:57:07.738062146 +0000 UTC m=+2032.065848822" watchObservedRunningTime="2026-03-21 04:57:07.753359517 +0000 UTC m=+2032.081146193" Mar 21 04:57:09 crc kubenswrapper[4839]: I0321 04:57:09.453168 4839 
scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:09 crc kubenswrapper[4839]: E0321 04:57:09.453462 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:21 crc kubenswrapper[4839]: I0321 04:57:21.453554 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:21 crc kubenswrapper[4839]: E0321 04:57:21.454466 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:23 crc kubenswrapper[4839]: I0321 04:57:23.038785 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:57:23 crc kubenswrapper[4839]: I0321 04:57:23.051372 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:57:24 crc kubenswrapper[4839]: I0321 04:57:24.463014 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" path="/var/lib/kubelet/pods/6c8778a4-d8b7-4331-be57-d1844b3c0f9f/volumes" Mar 21 04:57:35 crc kubenswrapper[4839]: I0321 04:57:35.452734 4839 scope.go:117] "RemoveContainer" 
containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:35 crc kubenswrapper[4839]: E0321 04:57:35.453530 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:40 crc kubenswrapper[4839]: I0321 04:57:40.963843 4839 generic.go:334] "Generic (PLEG): container finished" podID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerID="25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6" exitCode=0 Mar 21 04:57:40 crc kubenswrapper[4839]: I0321 04:57:40.963946 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerDied","Data":"25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6"} Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.379914 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502102 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502184 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502308 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502350 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503237 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503294 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503472 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503537 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503603 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503639 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503709 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503735 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.508700 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.508985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509009 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509045 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509115 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511155 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511162 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls" (OuterVolumeSpecName: "kube-api-access-44gls") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "kube-api-access-44gls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511184 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511239 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.512169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.530835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory" (OuterVolumeSpecName: "inventory") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.536536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606929 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606964 4839 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606977 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606986 4839 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606995 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") on node \"crc\" DevicePath \"\"" Mar 21 
04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607004 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607014 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607024 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607035 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607045 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607057 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607069 4839 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607081 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607095 4839 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981891 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerDied","Data":"6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb"} Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981937 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981940 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.128902 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:43 crc kubenswrapper[4839]: E0321 04:57:43.129286 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.129302 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.129467 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.130022 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.132079 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.134889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.135065 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.135201 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.138923 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.157951 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217357 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217494 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: 
\"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217576 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319411 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319467 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319596 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.320051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.320099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.321036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.323293 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.323503 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.329118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.337213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.461589 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.988466 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:44 crc kubenswrapper[4839]: I0321 04:57:44.999404 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerStarted","Data":"90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856"} Mar 21 04:57:46 crc kubenswrapper[4839]: I0321 04:57:46.009649 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerStarted","Data":"eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a"} Mar 21 04:57:46 crc kubenswrapper[4839]: I0321 04:57:46.038972 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" podStartSLOduration=2.27872117 podStartE2EDuration="3.03895032s" podCreationTimestamp="2026-03-21 04:57:43 +0000 UTC" firstStartedPulling="2026-03-21 04:57:44.000574567 +0000 UTC m=+2068.328361243" lastFinishedPulling="2026-03-21 04:57:44.760803717 +0000 UTC m=+2069.088590393" observedRunningTime="2026-03-21 04:57:46.025941903 +0000 UTC m=+2070.353728599" watchObservedRunningTime="2026-03-21 04:57:46.03895032 +0000 UTC m=+2070.366737006" Mar 21 04:57:50 crc kubenswrapper[4839]: I0321 04:57:50.453274 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:50 crc kubenswrapper[4839]: E0321 04:57:50.453977 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:55 crc kubenswrapper[4839]: I0321 04:57:55.437298 4839 scope.go:117] "RemoveContainer" containerID="07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.158275 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.160383 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163434 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163611 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163614 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.168186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.228228 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.329471 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.347240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.490151 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.920825 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:01 crc kubenswrapper[4839]: I0321 04:58:01.139776 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerStarted","Data":"e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2"} Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.153666 4839 generic.go:334] "Generic (PLEG): container finished" podID="04b644e0-9d17-491d-be8c-359dd9f82604" containerID="c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac" exitCode=0 Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.153772 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" 
event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerDied","Data":"c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac"} Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.453724 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.165400 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.516904 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.599459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"04b644e0-9d17-491d-be8c-359dd9f82604\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.605522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t" (OuterVolumeSpecName: "kube-api-access-rw85t") pod "04b644e0-9d17-491d-be8c-359dd9f82604" (UID: "04b644e0-9d17-491d-be8c-359dd9f82604"). InnerVolumeSpecName "kube-api-access-rw85t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.702083 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174488 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerDied","Data":"e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2"} Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174781 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.587226 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.595621 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:58:06 crc kubenswrapper[4839]: I0321 04:58:06.484237 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" path="/var/lib/kubelet/pods/d13124dd-cca5-49f6-9638-2cb42ed2bb34/volumes" Mar 21 04:58:40 crc kubenswrapper[4839]: I0321 04:58:40.514022 4839 generic.go:334] "Generic (PLEG): container finished" podID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerID="eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a" exitCode=0 Mar 21 04:58:40 crc kubenswrapper[4839]: I0321 04:58:40.514131 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerDied","Data":"eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a"} Mar 21 04:58:41 crc kubenswrapper[4839]: I0321 04:58:41.944705 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097192 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097615 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097656 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097790 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 
04:58:42.097849 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.102725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.106805 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm" (OuterVolumeSpecName: "kube-api-access-tp4jm") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "kube-api-access-tp4jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.122775 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.123611 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.125795 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory" (OuterVolumeSpecName: "inventory") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199735 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199775 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199786 4839 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199795 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199803 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546218 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerDied","Data":"90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856"} Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546261 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546318 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.766624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:42 crc kubenswrapper[4839]: E0321 04:58:42.767446 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.767545 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: E0321 04:58:42.767658 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.767732 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.768050 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.768168 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.769042 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773079 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773262 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773317 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773641 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773966 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.787378 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920109 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920168 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920282 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920315 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.021332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022081 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022173 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022343 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022467 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026432 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026909 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.027953 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.029517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.040655 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: 
\"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.089511 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.491304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.555295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerStarted","Data":"4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8"} Mar 21 04:58:44 crc kubenswrapper[4839]: I0321 04:58:44.566684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerStarted","Data":"6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a"} Mar 21 04:58:44 crc kubenswrapper[4839]: I0321 04:58:44.588638 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" podStartSLOduration=1.932611635 podStartE2EDuration="2.588621228s" podCreationTimestamp="2026-03-21 04:58:42 +0000 UTC" firstStartedPulling="2026-03-21 04:58:43.497915117 +0000 UTC m=+2127.825701813" lastFinishedPulling="2026-03-21 04:58:44.15392473 +0000 UTC m=+2128.481711406" observedRunningTime="2026-03-21 04:58:44.588072023 +0000 UTC m=+2128.915858709" watchObservedRunningTime="2026-03-21 04:58:44.588621228 +0000 UTC m=+2128.916407904" Mar 21 04:58:55 crc kubenswrapper[4839]: I0321 04:58:55.507046 4839 scope.go:117] "RemoveContainer" 
containerID="5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.142034 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.145723 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.157302 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181073 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181162 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181252 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283316 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283872 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283971 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.308886 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrrk\" (UniqueName: 
\"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.482332 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.957632 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.832696 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" exitCode=0 Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.833497 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867"} Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.833609 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9"} Mar 21 04:59:13 crc kubenswrapper[4839]: I0321 04:59:13.843531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} Mar 21 04:59:15 crc kubenswrapper[4839]: I0321 04:59:15.864353 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" 
containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" exitCode=0 Mar 21 04:59:15 crc kubenswrapper[4839]: I0321 04:59:15.864449 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} Mar 21 04:59:18 crc kubenswrapper[4839]: I0321 04:59:18.894811 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} Mar 21 04:59:19 crc kubenswrapper[4839]: I0321 04:59:19.933786 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gsh99" podStartSLOduration=4.518973304 podStartE2EDuration="8.933766506s" podCreationTimestamp="2026-03-21 04:59:11 +0000 UTC" firstStartedPulling="2026-03-21 04:59:12.83497879 +0000 UTC m=+2157.162765466" lastFinishedPulling="2026-03-21 04:59:17.249771982 +0000 UTC m=+2161.577558668" observedRunningTime="2026-03-21 04:59:19.925546194 +0000 UTC m=+2164.253332880" watchObservedRunningTime="2026-03-21 04:59:19.933766506 +0000 UTC m=+2164.261553202" Mar 21 04:59:21 crc kubenswrapper[4839]: I0321 04:59:21.483341 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:21 crc kubenswrapper[4839]: I0321 04:59:21.485124 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:22 crc kubenswrapper[4839]: I0321 04:59:22.530833 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gsh99" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" 
probeResult="failure" output=< Mar 21 04:59:22 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:59:22 crc kubenswrapper[4839]: > Mar 21 04:59:29 crc kubenswrapper[4839]: I0321 04:59:29.995024 4839 generic.go:334] "Generic (PLEG): container finished" podID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerID="6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a" exitCode=0 Mar 21 04:59:29 crc kubenswrapper[4839]: I0321 04:59:29.995090 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerDied","Data":"6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a"} Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.401654 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.520644 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.521008 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.521117 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522120 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522281 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.526826 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c" (OuterVolumeSpecName: "kube-api-access-zfz7c") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "kube-api-access-zfz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.533716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.544455 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.551497 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory" (OuterVolumeSpecName: "inventory") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.551543 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.554517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.581610 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.604360 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629869 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629919 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629936 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630011 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630049 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630065 4839 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 
21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.780937 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.024989 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.024988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerDied","Data":"4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8"} Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.025096 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.155723 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: E0321 04:59:32.156328 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.156364 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.156788 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.158094 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.162540 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.163625 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164307 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164481 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164701 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.177564 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345416 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345626 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: 
\"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345764 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447651 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447813 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447921 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447974 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.453727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: 
\"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.454619 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.455058 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.456786 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.475451 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.479432 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.984288 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: W0321 04:59:32.984436 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d056acb_0183_4157_a830_fff4cd1dcacf.slice/crio-a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca WatchSource:0}: Error finding container a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca: Status 404 returned error can't find the container with id a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.033680 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerStarted","Data":"a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca"} Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.033828 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gsh99" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" containerID="cri-o://6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" gracePeriod=2 Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.588666 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772494 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772831 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.774011 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities" (OuterVolumeSpecName: "utilities") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.778702 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk" (OuterVolumeSpecName: "kube-api-access-4hrrk") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "kube-api-access-4hrrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.875039 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.875306 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.935844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.978144 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043102 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" exitCode=0 Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043162 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043167 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043306 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043342 4839 scope.go:117] "RemoveContainer" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.046474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerStarted","Data":"38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.066882 4839 scope.go:117] "RemoveContainer" containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.072396 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" podStartSLOduration=1.273464228 podStartE2EDuration="2.072378733s" podCreationTimestamp="2026-03-21 04:59:32 +0000 UTC" firstStartedPulling="2026-03-21 04:59:32.98728634 +0000 UTC m=+2177.315073016" lastFinishedPulling="2026-03-21 04:59:33.786200805 +0000 UTC m=+2178.113987521" observedRunningTime="2026-03-21 04:59:34.067080714 +0000 UTC m=+2178.394867400" 
watchObservedRunningTime="2026-03-21 04:59:34.072378733 +0000 UTC m=+2178.400165409" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.095191 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.101340 4839 scope.go:117] "RemoveContainer" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.104594 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.125492 4839 scope.go:117] "RemoveContainer" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.125995 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": container with ID starting with 6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355 not found: ID does not exist" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126028 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} err="failed to get container status \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": rpc error: code = NotFound desc = could not find container \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": container with ID starting with 6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126048 4839 scope.go:117] "RemoveContainer" 
containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.126352 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": container with ID starting with cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9 not found: ID does not exist" containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126393 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} err="failed to get container status \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": rpc error: code = NotFound desc = could not find container \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": container with ID starting with cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126422 4839 scope.go:117] "RemoveContainer" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.126745 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": container with ID starting with 55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867 not found: ID does not exist" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126789 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867"} err="failed to get container status \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": rpc error: code = NotFound desc = could not find container \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": container with ID starting with 55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.188963 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21140940_0075_4e70_915c_e37382cc0dd8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21140940_0075_4e70_915c_e37382cc0dd8.slice/crio-208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9\": RecentStats: unable to find data in memory cache]" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.464445 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21140940-0075-4e70-915c-e37382cc0dd8" path="/var/lib/kubelet/pods/21140940-0075-4e70-915c-e37382cc0dd8/volumes" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.181851 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183261 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-content" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183290 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-content" Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183324 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-utilities" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183341 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-utilities" Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183391 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183408 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183875 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.185170 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.190477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.191049 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.191378 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.204768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.206394 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.208478 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.209579 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.213165 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.219758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.310242 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.310348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.311885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.311968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.415754 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.416030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.416296 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc 
kubenswrapper[4839]: I0321 05:00:00.416748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.418331 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.428472 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.431167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.436541 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " pod="openshift-infra/auto-csr-approver-29567820-drjbh"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.530159 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh"
Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.547421 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.013511 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"]
Mar 21 05:00:01 crc kubenswrapper[4839]: W0321 05:00:01.015112 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a082320_155d_4eb3_9779_9c6bb4db2b77.slice/crio-f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b WatchSource:0}: Error finding container f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b: Status 404 returned error can't find the container with id f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.080347 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"]
Mar 21 05:00:01 crc kubenswrapper[4839]: W0321 05:00:01.080356 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ddd6fb_1042_49a9_a76c_d00f5710a0fd.slice/crio-61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302 WatchSource:0}: Error finding container 61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302: Status 404 returned error can't find the container with id 61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.316267 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerStarted","Data":"002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a"}
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.316580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerStarted","Data":"61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302"}
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.318443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerStarted","Data":"f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b"}
Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.335728 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" podStartSLOduration=1.335697633 podStartE2EDuration="1.335697633s" podCreationTimestamp="2026-03-21 05:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:00:01.329736985 +0000 UTC m=+2205.657523691" watchObservedRunningTime="2026-03-21 05:00:01.335697633 +0000 UTC m=+2205.663484359"
Mar 21 05:00:02 crc kubenswrapper[4839]: I0321 05:00:02.329041 4839 generic.go:334] "Generic (PLEG): container finished" podID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerID="002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a" exitCode=0
Mar 21 05:00:02 crc kubenswrapper[4839]: I0321 05:00:02.329120 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerDied","Data":"002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a"}
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.354228 4839 generic.go:334] "Generic (PLEG): container finished" podID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerID="282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700" exitCode=0
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.354292 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerDied","Data":"282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700"}
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.722022 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.910845 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") "
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.910991 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") "
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.911963 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.912430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") "
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.913369 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.916798 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.918553 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg" (OuterVolumeSpecName: "kube-api-access-fkwlg") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). InnerVolumeSpecName "kube-api-access-fkwlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.015313 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.015358 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370193 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerDied","Data":"61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302"}
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370275 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302"
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.414159 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"]
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.422167 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"]
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.477818 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" path="/var/lib/kubelet/pods/0368223e-2e01-4681-a7a6-67b77387f8d8/volumes"
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.699965 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh"
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.828185 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"2a082320-155d-4eb3-9779-9c6bb4db2b77\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") "
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.834467 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z" (OuterVolumeSpecName: "kube-api-access-5lr8z") pod "2a082320-155d-4eb3-9779-9c6bb4db2b77" (UID: "2a082320-155d-4eb3-9779-9c6bb4db2b77"). InnerVolumeSpecName "kube-api-access-5lr8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.930467 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerDied","Data":"f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b"}
Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379287 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b"
Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379351 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh"
Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.756408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.785389 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 05:00:06 crc kubenswrapper[4839]: I0321 05:00:06.465932 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" path="/var/lib/kubelet/pods/852785cf-c79d-4c8e-92f0-f15d9836b437/volumes"
Mar 21 05:00:30 crc kubenswrapper[4839]: I0321 05:00:30.980061 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:00:30 crc kubenswrapper[4839]: I0321 05:00:30.981743 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.720838 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:34 crc kubenswrapper[4839]: E0321 05:00:34.721656 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc"
Mar 21 05:00:34 crc kubenswrapper[4839]: E0321 05:00:34.721691 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721698 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721896 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721920 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.724875 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.759641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854825 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956245 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956310 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956517 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.957016 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.957036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.976065 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.043307 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.410770 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:35 crc kubenswrapper[4839]: W0321 05:00:35.425936 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7515499_6e18_46fa_b97d_583a44f6066d.slice/crio-5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d WatchSource:0}: Error finding container 5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d: Status 404 returned error can't find the container with id 5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d
Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.711097 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerStarted","Data":"5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d"}
Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.742808 4839 generic.go:334] "Generic (PLEG): container finished" podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261" exitCode=0
Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.742924 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"}
Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.746103 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:00:38 crc kubenswrapper[4839]: I0321 05:00:38.766812 4839 generic.go:334] "Generic (PLEG): container finished" podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958" exitCode=0
Mar 21 05:00:38 crc kubenswrapper[4839]: I0321 05:00:38.767165 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"}
Mar 21 05:00:40 crc kubenswrapper[4839]: I0321 05:00:40.789765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerStarted","Data":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"}
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.044470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.045235 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.117196 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.142759 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbnr8" podStartSLOduration=8.150740645 podStartE2EDuration="11.142744052s" podCreationTimestamp="2026-03-21 05:00:34 +0000 UTC" firstStartedPulling="2026-03-21 05:00:36.745823917 +0000 UTC m=+2241.073610593" lastFinishedPulling="2026-03-21 05:00:39.737827324 +0000 UTC m=+2244.065614000" observedRunningTime="2026-03-21 05:00:40.820750007 +0000 UTC m=+2245.148536713" watchObservedRunningTime="2026-03-21 05:00:45.142744052 +0000 UTC m=+2249.470530728"
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.891550 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.951194 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:47 crc kubenswrapper[4839]: I0321 05:00:47.860519 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbnr8" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server" containerID="cri-o://0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" gracePeriod=2
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.346310 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.482328 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") "
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.483298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") "
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.483338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") "
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.485237 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities" (OuterVolumeSpecName: "utilities") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.487975 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.488548 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw" (OuterVolumeSpecName: "kube-api-access-2hpzw") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "kube-api-access-2hpzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.557009 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.589284 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.589325 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873632 4839 generic.go:334] "Generic (PLEG): container finished" podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" exitCode=0
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873731 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"}
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.874022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d"}
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.874090 4839 scope.go:117] "RemoveContainer" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.895385 4839 scope.go:117] "RemoveContainer" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.915643 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.934612 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"]
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.947815 4839 scope.go:117] "RemoveContainer" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.969057 4839 scope.go:117] "RemoveContainer" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"
Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.969953 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": container with ID starting with 0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8 not found: ID does not exist" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970026 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"} err="failed to get container status \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": rpc error: code = NotFound desc = could not find container \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": container with ID starting with 0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8 not found: ID does not exist"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970053 4839 scope.go:117] "RemoveContainer" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"
Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.970452 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": container with ID starting with 9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958 not found: ID does not exist" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970492 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"} err="failed to get container status \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": rpc error: code = NotFound desc = could not find container \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": container with ID starting with 9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958 not found: ID does not exist"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970516 4839 scope.go:117] "RemoveContainer" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"
Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.970762 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": container with ID starting with 8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261 not found: ID does not exist" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"
Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970783 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"} err="failed to get container status \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": rpc error: code = NotFound desc = could not find container \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": container with ID starting with 8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261 not found: ID does not exist"
Mar 21 05:00:50 crc kubenswrapper[4839]: I0321 05:00:50.463519 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" path="/var/lib/kubelet/pods/c7515499-6e18-46fa-b97d-583a44f6066d/volumes"
Mar 21 05:00:55 crc kubenswrapper[4839]: I0321 05:00:55.610446 4839 scope.go:117] "RemoveContainer" containerID="d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c"
Mar 21 05:00:55 crc kubenswrapper[4839]: I0321 05:00:55.660711 4839 scope.go:117] "RemoveContainer" containerID="8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.154023 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567821-rmctn"]
Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156351 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-content"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156376 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-content"
Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156408 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-utilities"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156416 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-utilities"
Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156435 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156442 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156706 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.157531 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.163775 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567821-rmctn"]
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305713 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305776 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305831 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305882 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407351 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407430 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407961 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn"
Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407994 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.417773 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.417891 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.418005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.430071 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.474404 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.928522 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567821-rmctn"] Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.980638 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.980703 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.991799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerStarted","Data":"84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c"} Mar 21 05:01:02 crc kubenswrapper[4839]: I0321 05:01:02.002465 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerStarted","Data":"04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c"} Mar 21 05:01:02 crc kubenswrapper[4839]: I0321 05:01:02.028849 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567821-rmctn" podStartSLOduration=2.028834864 podStartE2EDuration="2.028834864s" podCreationTimestamp="2026-03-21 05:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:01:02.025271404 +0000 UTC m=+2266.353058080" watchObservedRunningTime="2026-03-21 05:01:02.028834864 +0000 UTC m=+2266.356621540" Mar 21 05:01:04 crc kubenswrapper[4839]: I0321 05:01:04.021044 4839 generic.go:334] "Generic (PLEG): container finished" podID="666be2f4-0416-4086-94d3-c48c82f380b2" containerID="04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c" exitCode=0 Mar 21 05:01:04 crc kubenswrapper[4839]: I0321 05:01:04.021127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerDied","Data":"04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c"} Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.451326 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602700 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602833 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602943 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: 
\"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602965 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.612693 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.618695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr" (OuterVolumeSpecName: "kube-api-access-hj9gr") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "kube-api-access-hj9gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.647913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.680744 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data" (OuterVolumeSpecName: "config-data") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.704964 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705012 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705023 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705035 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.044644 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerDied","Data":"84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c"} Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.045015 4839 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c" Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.045103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.980486 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981084 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981132 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981890 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981955 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" 
containerID="cri-o://f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" gracePeriod=600 Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267147 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" exitCode=0 Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267231 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267471 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 05:01:32 crc kubenswrapper[4839]: I0321 05:01:32.278160 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.144561 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:00 crc kubenswrapper[4839]: E0321 05:02:00.146404 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.146485 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.146739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc 
kubenswrapper[4839]: I0321 05:02:00.147514 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149415 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149722 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.152639 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.219057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.320366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.339816 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod 
\"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.467459 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.894786 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:01 crc kubenswrapper[4839]: I0321 05:02:01.564385 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerStarted","Data":"6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f"} Mar 21 05:02:03 crc kubenswrapper[4839]: I0321 05:02:03.581695 4839 generic.go:334] "Generic (PLEG): container finished" podID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerID="292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1" exitCode=0 Mar 21 05:02:03 crc kubenswrapper[4839]: I0321 05:02:03.581784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerDied","Data":"292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1"} Mar 21 05:02:04 crc kubenswrapper[4839]: I0321 05:02:04.951728 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.016950 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"5246ade9-02c7-4a6c-b903-f556b6405d03\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.022899 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz" (OuterVolumeSpecName: "kube-api-access-k7vdz") pod "5246ade9-02c7-4a6c-b903-f556b6405d03" (UID: "5246ade9-02c7-4a6c-b903-f556b6405d03"). InnerVolumeSpecName "kube-api-access-k7vdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.118856 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.620775 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerDied","Data":"6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f"} Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.621298 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.620837 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.016449 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.022842 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.463091 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" path="/var/lib/kubelet/pods/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8/volumes" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.482945 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:10 crc kubenswrapper[4839]: E0321 05:02:10.485470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.485494 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.485694 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.487278 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.501177 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558038 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558428 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558496 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662324 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.663130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.663196 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.683787 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.808836 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:11 crc kubenswrapper[4839]: I0321 05:02:11.310692 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:11 crc kubenswrapper[4839]: I0321 05:02:11.672046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerStarted","Data":"373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506"} Mar 21 05:02:12 crc kubenswrapper[4839]: I0321 05:02:12.682174 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139" exitCode=0 Mar 21 05:02:12 crc kubenswrapper[4839]: I0321 05:02:12.682233 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139"} Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.482669 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.484985 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.493391 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.616999 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.617130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.617187 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719584 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719833 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.720235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.720260 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.742515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.807597 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:14 crc kubenswrapper[4839]: W0321 05:02:14.325092 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf939e367_e323_4cac_85d0_55d26d60f4ec.slice/crio-226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e WatchSource:0}: Error finding container 226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e: Status 404 returned error can't find the container with id 226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.325808 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.701871 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d" exitCode=0 Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.701969 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d"} Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.704352 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" exitCode=0 Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.704390 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55"} Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 
05:02:14.704413 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.715859 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerStarted","Data":"b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.718472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.740239 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x7bww" podStartSLOduration=3.075991648 podStartE2EDuration="5.740214745s" podCreationTimestamp="2026-03-21 05:02:10 +0000 UTC" firstStartedPulling="2026-03-21 05:02:12.683834041 +0000 UTC m=+2337.011620757" lastFinishedPulling="2026-03-21 05:02:15.348057178 +0000 UTC m=+2339.675843854" observedRunningTime="2026-03-21 05:02:15.733697071 +0000 UTC m=+2340.061483767" watchObservedRunningTime="2026-03-21 05:02:15.740214745 +0000 UTC m=+2340.068001421" Mar 21 05:02:16 crc kubenswrapper[4839]: I0321 05:02:16.728847 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" exitCode=0 Mar 21 05:02:16 crc kubenswrapper[4839]: I0321 05:02:16.728960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" 
event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.810257 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.810891 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.858398 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:21 crc kubenswrapper[4839]: I0321 05:02:21.824099 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:21 crc kubenswrapper[4839]: I0321 05:02:21.868918 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:22 crc kubenswrapper[4839]: I0321 05:02:22.789541 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.799320 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x7bww" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" containerID="cri-o://b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" gracePeriod=2 Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.808762 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:23 crc kubenswrapper[4839]: 
I0321 05:02:23.809209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.819658 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pb97p" podStartSLOduration=4.034429463 podStartE2EDuration="10.819623626s" podCreationTimestamp="2026-03-21 05:02:13 +0000 UTC" firstStartedPulling="2026-03-21 05:02:14.706362406 +0000 UTC m=+2339.034149082" lastFinishedPulling="2026-03-21 05:02:21.491556569 +0000 UTC m=+2345.819343245" observedRunningTime="2026-03-21 05:02:23.817887397 +0000 UTC m=+2348.145674073" watchObservedRunningTime="2026-03-21 05:02:23.819623626 +0000 UTC m=+2348.147410302" Mar 21 05:02:24 crc kubenswrapper[4839]: I0321 05:02:24.863256 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pb97p" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" probeResult="failure" output=< Mar 21 05:02:24 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:02:24 crc kubenswrapper[4839]: > Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.832971 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" exitCode=0 Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956"} Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" 
event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506"} Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833330 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.842817 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950383 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950986 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities" (OuterVolumeSpecName: "utilities") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.956012 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578" (OuterVolumeSpecName: "kube-api-access-hk578") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). InnerVolumeSpecName "kube-api-access-hk578". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.977524 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053166 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053221 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053238 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.840149 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.864375 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.873474 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:28 crc kubenswrapper[4839]: I0321 05:02:28.462065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" path="/var/lib/kubelet/pods/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3/volumes" Mar 21 05:02:33 crc kubenswrapper[4839]: I0321 05:02:33.859731 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:33 crc kubenswrapper[4839]: I0321 05:02:33.919205 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:34 crc kubenswrapper[4839]: I0321 05:02:34.099119 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:34 crc kubenswrapper[4839]: I0321 05:02:34.903346 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pb97p" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" containerID="cri-o://935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" gracePeriod=2 Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.371451 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427588 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427848 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427968 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.428864 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities" (OuterVolumeSpecName: "utilities") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.440970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2" (OuterVolumeSpecName: "kube-api-access-sgwg2") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "kube-api-access-sgwg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.486188 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532278 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532412 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532437 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913837 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" exitCode=0 Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913903 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913934 4839 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913979 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e"} Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.914016 4839 scope.go:117] "RemoveContainer" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.938655 4839 scope.go:117] "RemoveContainer" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.962019 4839 scope.go:117] "RemoveContainer" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.965785 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.974216 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013149 4839 scope.go:117] "RemoveContainer" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 05:02:36.013915 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": container with ID starting with 935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef not found: ID does not exist" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013960 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} err="failed to get container status \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": rpc error: code = NotFound desc = could not find container \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": container with ID starting with 935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013986 4839 scope.go:117] "RemoveContainer" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 05:02:36.014410 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": container with ID starting with 45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f not found: ID does not exist" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014438 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} err="failed to get container status \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": rpc error: code = NotFound desc = could not find container \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": container with ID starting with 45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014461 4839 scope.go:117] "RemoveContainer" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 
05:02:36.014743 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": container with ID starting with 05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55 not found: ID does not exist" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014766 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55"} err="failed to get container status \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": rpc error: code = NotFound desc = could not find container \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": container with ID starting with 05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55 not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.467128 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" path="/var/lib/kubelet/pods/f939e367-e323-4cac-85d0-55d26d60f4ec/volumes" Mar 21 05:02:55 crc kubenswrapper[4839]: I0321 05:02:55.777666 4839 scope.go:117] "RemoveContainer" containerID="4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886" Mar 21 05:03:21 crc kubenswrapper[4839]: I0321 05:03:21.349552 4839 generic.go:334] "Generic (PLEG): container finished" podID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerID="38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d" exitCode=0 Mar 21 05:03:21 crc kubenswrapper[4839]: I0321 05:03:21.349659 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" 
event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerDied","Data":"38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d"} Mar 21 05:03:22 crc kubenswrapper[4839]: I0321 05:03:22.847361 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021341 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021446 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021823 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021974 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: 
\"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.028463 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.032741 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj" (OuterVolumeSpecName: "kube-api-access-pf8mj") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "kube-api-access-pf8mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.058874 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.061184 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory" (OuterVolumeSpecName: "inventory") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.074824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125159 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125215 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125236 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125255 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125274 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerDied","Data":"a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca"} Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373718 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373303 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.472862 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473305 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473329 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473347 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473354 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473381 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473389 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" 
containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473401 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473408 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473425 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473434 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473448 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473457 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473471 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473482 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473750 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473764 4839 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473784 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.474600 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481396 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481834 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.483986 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484475 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.497005 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532646 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532829 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532879 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532926 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: 
\"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.533005 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634945 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634976 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634998 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635020 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635040 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635072 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635089 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635116 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635172 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 
05:03:23.636413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.640928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.641790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.641930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642094 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642255 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.643942 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.644693 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.654324 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.793425 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:24 crc kubenswrapper[4839]: W0321 05:03:24.382143 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice/crio-8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580 WatchSource:0}: Error finding container 8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580: Status 404 returned error can't find the container with id 8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580 Mar 21 05:03:24 crc kubenswrapper[4839]: I0321 05:03:24.384100 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.393352 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" 
event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerStarted","Data":"cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e"} Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.393778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerStarted","Data":"8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580"} Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.424071 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" podStartSLOduration=1.999634482 podStartE2EDuration="2.42404876s" podCreationTimestamp="2026-03-21 05:03:23 +0000 UTC" firstStartedPulling="2026-03-21 05:03:24.385063112 +0000 UTC m=+2408.712849788" lastFinishedPulling="2026-03-21 05:03:24.80947738 +0000 UTC m=+2409.137264066" observedRunningTime="2026-03-21 05:03:25.421238401 +0000 UTC m=+2409.749025107" watchObservedRunningTime="2026-03-21 05:03:25.42404876 +0000 UTC m=+2409.751835446" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.141932 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.143784 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.146768 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.146790 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.147013 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.153887 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.312028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.413809 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.433298 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " 
pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.464979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.902526 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.980876 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.981170 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:01 crc kubenswrapper[4839]: I0321 05:04:01.763099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerStarted","Data":"6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b"} Mar 21 05:04:02 crc kubenswrapper[4839]: I0321 05:04:02.772105 4839 generic.go:334] "Generic (PLEG): container finished" podID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerID="ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6" exitCode=0 Mar 21 05:04:02 crc kubenswrapper[4839]: I0321 05:04:02.772170 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" 
event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerDied","Data":"ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6"} Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.097499 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.288029 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.293855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t" (OuterVolumeSpecName: "kube-api-access-fxk6t") pod "0bbefcc3-042e-4587-b172-1a1de0f34dcf" (UID: "0bbefcc3-042e-4587-b172-1a1de0f34dcf"). InnerVolumeSpecName "kube-api-access-fxk6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.391030 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerDied","Data":"6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b"} Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797783 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797538 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:05 crc kubenswrapper[4839]: I0321 05:04:05.174928 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 05:04:05 crc kubenswrapper[4839]: I0321 05:04:05.185423 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 05:04:06 crc kubenswrapper[4839]: I0321 05:04:06.463601 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" path="/var/lib/kubelet/pods/04b644e0-9d17-491d-be8c-359dd9f82604/volumes" Mar 21 05:04:30 crc kubenswrapper[4839]: I0321 05:04:30.980611 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:04:30 crc kubenswrapper[4839]: I0321 05:04:30.981455 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:55 crc kubenswrapper[4839]: I0321 05:04:55.877146 4839 scope.go:117] "RemoveContainer" containerID="c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.980209 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.981013 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.981089 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.982219 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.982325 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" gracePeriod=600 Mar 21 05:05:01 crc kubenswrapper[4839]: E0321 05:05:01.106758 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297675 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" exitCode=0 Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297787 4839 scope.go:117] "RemoveContainer" containerID="f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.298648 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:01 crc kubenswrapper[4839]: E0321 05:05:01.299104 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:13 crc kubenswrapper[4839]: I0321 05:05:13.453971 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:13 crc kubenswrapper[4839]: E0321 05:05:13.454835 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:27 crc kubenswrapper[4839]: I0321 05:05:27.452641 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:27 crc kubenswrapper[4839]: E0321 05:05:27.453440 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:39 crc kubenswrapper[4839]: I0321 05:05:39.453362 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:39 crc kubenswrapper[4839]: E0321 05:05:39.455236 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:42 crc kubenswrapper[4839]: I0321 05:05:42.658810 4839 generic.go:334] "Generic (PLEG): container finished" podID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerID="cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e" exitCode=0 Mar 21 05:05:42 crc kubenswrapper[4839]: I0321 05:05:42.658875 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerDied","Data":"cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e"} Mar 21 05:05:42 crc kubenswrapper[4839]: E0321 05:05:42.982909 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.038384 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141902 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141967 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 
05:05:44.142042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142066 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142097 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142120 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142162 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142239 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: 
\"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.164390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv" (OuterVolumeSpecName: "kube-api-access-5jjdv") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "kube-api-access-5jjdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.166036 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.170251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.170968 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.172266 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.176147 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.179943 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory" (OuterVolumeSpecName: "inventory") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.189533 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.201816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.204475 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244317 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244348 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244360 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244371 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244383 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244393 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244403 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244447 4839 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244458 4839 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244470 4839 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.736374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.752820 4839 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753435 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerDied","Data":"8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580"} Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753460 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753510 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792358 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:44 crc kubenswrapper[4839]: E0321 05:05:44.792790 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792808 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: E0321 05:05:44.792828 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792839 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793002 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793017 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793636 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802267 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802291 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802702 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802349 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802400 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.812963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860595 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860656 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860693 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860797 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860816 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.963822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964028 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964067 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964105 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964339 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964413 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.970723 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.973002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: 
\"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.983105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.983201 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.030827 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.704359 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.709481 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:47 crc kubenswrapper[4839]: I0321 05:05:47.112350 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerStarted","Data":"d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216"} Mar 21 05:05:48 crc kubenswrapper[4839]: I0321 05:05:48.122463 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerStarted","Data":"6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1"} Mar 21 05:05:48 crc kubenswrapper[4839]: I0321 05:05:48.159115 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" podStartSLOduration=3.403742169 podStartE2EDuration="4.159097347s" podCreationTimestamp="2026-03-21 05:05:44 +0000 UTC" firstStartedPulling="2026-03-21 05:05:46.7040993 +0000 UTC m=+2551.031885976" lastFinishedPulling="2026-03-21 05:05:47.459454478 +0000 UTC m=+2551.787241154" observedRunningTime="2026-03-21 05:05:48.1460952 +0000 UTC m=+2552.473881886" watchObservedRunningTime="2026-03-21 05:05:48.159097347 +0000 UTC m=+2552.486884023" Mar 21 05:05:51 crc kubenswrapper[4839]: I0321 05:05:51.453522 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:51 crc kubenswrapper[4839]: E0321 
05:05:51.454164 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.147919 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.149697 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.153885 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.154106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.154320 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.157564 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.261057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 
05:06:00.362703 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.389012 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.468863 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.912656 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:01 crc kubenswrapper[4839]: I0321 05:06:01.230326 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerStarted","Data":"74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3"} Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.242935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerStarted","Data":"1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2"} Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.263830 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" 
podStartSLOduration=1.415089748 podStartE2EDuration="2.263808999s" podCreationTimestamp="2026-03-21 05:06:00 +0000 UTC" firstStartedPulling="2026-03-21 05:06:00.925015429 +0000 UTC m=+2565.252802105" lastFinishedPulling="2026-03-21 05:06:01.77373469 +0000 UTC m=+2566.101521356" observedRunningTime="2026-03-21 05:06:02.257970405 +0000 UTC m=+2566.585757081" watchObservedRunningTime="2026-03-21 05:06:02.263808999 +0000 UTC m=+2566.591595675" Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.453685 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:02 crc kubenswrapper[4839]: E0321 05:06:02.453936 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:03 crc kubenswrapper[4839]: I0321 05:06:03.255238 4839 generic.go:334] "Generic (PLEG): container finished" podID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerID="1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2" exitCode=0 Mar 21 05:06:03 crc kubenswrapper[4839]: I0321 05:06:03.255560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerDied","Data":"1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2"} Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.633269 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.748430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.760206 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb" (OuterVolumeSpecName: "kube-api-access-wvjrb") pod "ba8ac9dd-e3e7-4e21-9286-731d926d9580" (UID: "ba8ac9dd-e3e7-4e21-9286-731d926d9580"). InnerVolumeSpecName "kube-api-access-wvjrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.851134 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271760 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerDied","Data":"74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3"} Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271809 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271813 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.340293 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.350510 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:06:06 crc kubenswrapper[4839]: I0321 05:06:06.463671 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" path="/var/lib/kubelet/pods/2a082320-155d-4eb3-9779-9c6bb4db2b77/volumes" Mar 21 05:06:14 crc kubenswrapper[4839]: I0321 05:06:14.812473 4839 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice" Mar 21 05:06:14 crc kubenswrapper[4839]: E0321 05:06:14.813084 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" Mar 21 05:06:15 crc kubenswrapper[4839]: I0321 05:06:15.350195 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:06:17 crc kubenswrapper[4839]: I0321 05:06:17.452783 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:17 crc kubenswrapper[4839]: E0321 05:06:17.453367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:28 crc kubenswrapper[4839]: I0321 05:06:28.453413 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:28 crc kubenswrapper[4839]: E0321 05:06:28.456249 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:43 crc kubenswrapper[4839]: I0321 05:06:43.453510 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:43 crc kubenswrapper[4839]: E0321 05:06:43.454514 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:55 crc kubenswrapper[4839]: I0321 05:06:55.968836 4839 scope.go:117] "RemoveContainer" containerID="282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700" Mar 21 05:06:58 crc kubenswrapper[4839]: I0321 05:06:58.452744 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:58 crc kubenswrapper[4839]: E0321 05:06:58.453544 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:10 crc kubenswrapper[4839]: I0321 05:07:10.452251 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:10 crc kubenswrapper[4839]: E0321 05:07:10.454395 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:23 crc kubenswrapper[4839]: I0321 05:07:23.453228 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:23 crc kubenswrapper[4839]: E0321 05:07:23.454009 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:34 crc kubenswrapper[4839]: I0321 05:07:34.452990 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:34 crc kubenswrapper[4839]: E0321 05:07:34.453686 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:47 crc kubenswrapper[4839]: I0321 05:07:47.454113 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:47 crc kubenswrapper[4839]: E0321 05:07:47.456094 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:59 crc kubenswrapper[4839]: I0321 05:07:59.453040 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:59 crc kubenswrapper[4839]: E0321 05:07:59.453834 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150079 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:00 crc kubenswrapper[4839]: E0321 05:08:00.150610 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150891 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.151669 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.153982 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.153992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.156480 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.160516 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.219462 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.320870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.339887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " 
pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.473436 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.917927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:01 crc kubenswrapper[4839]: I0321 05:08:01.426597 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerStarted","Data":"6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941"} Mar 21 05:08:03 crc kubenswrapper[4839]: I0321 05:08:03.453694 4839 generic.go:334] "Generic (PLEG): container finished" podID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerID="4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298" exitCode=0 Mar 21 05:08:03 crc kubenswrapper[4839]: I0321 05:08:03.453773 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerDied","Data":"4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298"} Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.793622 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.938558 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.944592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj" (OuterVolumeSpecName: "kube-api-access-hwlvj") pod "e550e6e6-fc33-4703-b8db-6cd8169ebc7f" (UID: "e550e6e6-fc33-4703-b8db-6cd8169ebc7f"). InnerVolumeSpecName "kube-api-access-hwlvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.041425 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475454 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerDied","Data":"6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941"} Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475492 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475496 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.865215 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.873426 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:08:06 crc kubenswrapper[4839]: I0321 05:08:06.462654 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" path="/var/lib/kubelet/pods/5246ade9-02c7-4a6c-b903-f556b6405d03/volumes" Mar 21 05:08:07 crc kubenswrapper[4839]: I0321 05:08:07.494140 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerDied","Data":"6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1"} Mar 21 05:08:07 crc kubenswrapper[4839]: I0321 05:08:07.494189 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerID="6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1" exitCode=0 Mar 21 05:08:08 crc kubenswrapper[4839]: I0321 05:08:08.946699 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.119890 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.119990 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120112 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120140 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 
05:08:09.120217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120267 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.129817 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.131901 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz" (OuterVolumeSpecName: "kube-api-access-v5mvz") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "kube-api-access-v5mvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.148120 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.149835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.150855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.152770 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.152813 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory" (OuterVolumeSpecName: "inventory") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222093 4839 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222129 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222861 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222886 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222898 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222933 4839 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222946 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512024 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerDied","Data":"d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216"} Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512109 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512119 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:08:14 crc kubenswrapper[4839]: I0321 05:08:14.453175 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:14 crc kubenswrapper[4839]: E0321 05:08:14.454059 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:28 crc kubenswrapper[4839]: I0321 05:08:28.453439 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:28 crc kubenswrapper[4839]: E0321 05:08:28.454315 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:42 crc kubenswrapper[4839]: I0321 05:08:42.453028 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:42 crc kubenswrapper[4839]: E0321 05:08:42.453946 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.523440 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:54 crc kubenswrapper[4839]: E0321 05:08:54.524457 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524476 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: E0321 05:08:54.524493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524500 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524756 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.525452 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.527764 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.527891 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.528947 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6v5zd" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.529297 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.537307 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552255 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654240 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654901 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654982 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655449 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.657510 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.657647 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.665435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.757408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.757937 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: 
I0321 05:08:54.758151 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758269 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758284 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758698 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758853 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc 
kubenswrapper[4839]: I0321 05:08:54.758934 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.763173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.764508 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.782311 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.786208 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.862418 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:08:55 crc kubenswrapper[4839]: I0321 05:08:55.337724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:55 crc kubenswrapper[4839]: I0321 05:08:55.927890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerStarted","Data":"de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6"} Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.077989 4839 scope.go:117] "RemoveContainer" containerID="4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.125956 4839 scope.go:117] "RemoveContainer" containerID="1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.166490 4839 scope.go:117] "RemoveContainer" containerID="292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.251436 4839 scope.go:117] "RemoveContainer" containerID="b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" Mar 21 05:08:57 crc kubenswrapper[4839]: I0321 05:08:57.453217 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:57 crc kubenswrapper[4839]: E0321 05:08:57.453785 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:08 crc kubenswrapper[4839]: I0321 05:09:08.452704 4839 scope.go:117] 
"RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:08 crc kubenswrapper[4839]: E0321 05:09:08.453718 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:22 crc kubenswrapper[4839]: I0321 05:09:22.453453 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:22 crc kubenswrapper[4839]: E0321 05:09:22.454648 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.562861 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.563379 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x84s5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.564542 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" Mar 21 05:09:24 crc kubenswrapper[4839]: E0321 05:09:24.177224 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" Mar 21 05:09:36 crc 
kubenswrapper[4839]: I0321 05:09:36.877081 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:09:37 crc kubenswrapper[4839]: I0321 05:09:37.453072 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:37 crc kubenswrapper[4839]: E0321 05:09:37.453707 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.017099 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.021127 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.028282 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123356 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.225141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226709 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.245055 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.331697 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerStarted","Data":"0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b"} Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.340696 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.367073 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.8323489139999998 podStartE2EDuration="45.367055123s" podCreationTimestamp="2026-03-21 05:08:53 +0000 UTC" firstStartedPulling="2026-03-21 05:08:55.339294053 +0000 UTC m=+2739.667080729" lastFinishedPulling="2026-03-21 05:09:36.874000262 +0000 UTC m=+2781.201786938" observedRunningTime="2026-03-21 05:09:38.356829495 +0000 UTC m=+2782.684616171" watchObservedRunningTime="2026-03-21 05:09:38.367055123 +0000 UTC m=+2782.694841799" Mar 21 05:09:38 crc kubenswrapper[4839]: W0321 05:09:38.836165 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9724e408_6086_45cd_961d_5d5504f15791.slice/crio-3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27 WatchSource:0}: Error finding container 3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27: Status 404 returned error can't find the container with id 3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27 Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.847186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344037 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" exitCode=0 Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344266 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237"} Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344358 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27"} Mar 21 05:09:40 crc kubenswrapper[4839]: I0321 05:09:40.357171 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} Mar 21 05:09:42 crc kubenswrapper[4839]: I0321 05:09:42.381973 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" exitCode=0 Mar 21 05:09:42 crc kubenswrapper[4839]: I0321 05:09:42.382046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} Mar 21 05:09:43 crc kubenswrapper[4839]: I0321 05:09:43.394198 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} Mar 21 05:09:43 crc kubenswrapper[4839]: I0321 05:09:43.425301 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qslfs" 
podStartSLOduration=2.961230426 podStartE2EDuration="6.425280885s" podCreationTimestamp="2026-03-21 05:09:37 +0000 UTC" firstStartedPulling="2026-03-21 05:09:39.345998298 +0000 UTC m=+2783.673784974" lastFinishedPulling="2026-03-21 05:09:42.810048757 +0000 UTC m=+2787.137835433" observedRunningTime="2026-03-21 05:09:43.421466888 +0000 UTC m=+2787.749253564" watchObservedRunningTime="2026-03-21 05:09:43.425280885 +0000 UTC m=+2787.753067561" Mar 21 05:09:48 crc kubenswrapper[4839]: I0321 05:09:48.341015 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:48 crc kubenswrapper[4839]: I0321 05:09:48.341478 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:49 crc kubenswrapper[4839]: I0321 05:09:49.399893 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:09:49 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:09:49 crc kubenswrapper[4839]: > Mar 21 05:09:51 crc kubenswrapper[4839]: I0321 05:09:51.453204 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:51 crc kubenswrapper[4839]: E0321 05:09:51.453781 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:59 crc kubenswrapper[4839]: I0321 05:09:59.389726 4839 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:09:59 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:09:59 crc kubenswrapper[4839]: > Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.150826 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.152708 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.154846 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.157972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.158269 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.159979 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.208387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.310256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.330467 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.475543 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.998146 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:01 crc kubenswrapper[4839]: I0321 05:10:01.731383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerStarted","Data":"6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302"} Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.453142 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.747244 4839 generic.go:334] "Generic (PLEG): container finished" podID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerID="3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2" exitCode=0 Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.747328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" 
event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerDied","Data":"3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2"} Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.750316 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.208032 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.323277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.336779 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr" (OuterVolumeSpecName: "kube-api-access-kl7qr") pod "aec2b9da-f24b-47bb-95cc-2903624e2eb1" (UID: "aec2b9da-f24b-47bb-95cc-2903624e2eb1"). InnerVolumeSpecName "kube-api-access-kl7qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.425458 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.775894 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerDied","Data":"6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302"} Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.779438 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.778987 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.282392 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.290296 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.465484 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" path="/var/lib/kubelet/pods/0bbefcc3-042e-4587-b172-1a1de0f34dcf/volumes" Mar 21 05:10:09 crc kubenswrapper[4839]: I0321 05:10:09.397810 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:10:09 crc kubenswrapper[4839]: 
timeout: failed to connect service ":50051" within 1s Mar 21 05:10:09 crc kubenswrapper[4839]: > Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.382310 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.429143 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.616319 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:19 crc kubenswrapper[4839]: I0321 05:10:19.915475 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" containerID="cri-o://a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" gracePeriod=2 Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.905627 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934462 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" exitCode=0 Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934505 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27"} Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934584 4839 scope.go:117] "RemoveContainer" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934670 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.956069 4839 scope.go:117] "RemoveContainer" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.980723 4839 scope.go:117] "RemoveContainer" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018286 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018384 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018435 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.019494 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities" (OuterVolumeSpecName: "utilities") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.022414 4839 scope.go:117] "RemoveContainer" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.023616 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": container with ID starting with a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4 not found: ID does not exist" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.023681 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} err="failed to get container status \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": rpc error: code = NotFound desc = could not find container \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": container with ID starting with a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.023716 4839 scope.go:117] "RemoveContainer" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.024128 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": container with ID starting with f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8 not found: ID does not exist" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024155 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} err="failed to get container status \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": rpc error: code = NotFound desc = could not find container \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": container with ID starting with f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024175 4839 scope.go:117] "RemoveContainer" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.024595 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": container with ID starting with f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237 not found: ID does not exist" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024639 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237"} err="failed to get container status \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": rpc error: code = NotFound desc = could not find container \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": container with ID starting with f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.027892 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52" 
(OuterVolumeSpecName: "kube-api-access-mmx52") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "kube-api-access-mmx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.120500 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.120525 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.154527 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.222354 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.276751 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.285721 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:22 crc kubenswrapper[4839]: I0321 05:10:22.463610 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9724e408-6086-45cd-961d-5d5504f15791" path="/var/lib/kubelet/pods/9724e408-6086-45cd-961d-5d5504f15791/volumes" Mar 21 05:10:56 crc kubenswrapper[4839]: I0321 05:10:56.350098 4839 scope.go:117] "RemoveContainer" containerID="ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.441788 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443196 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443214 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443241 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-utilities" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443250 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-utilities" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443258 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-content" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443267 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-content" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443289 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443297 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443491 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443513 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.444967 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.466976 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.543945 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.544088 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.544148 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645561 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645700 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645758 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.646107 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.646140 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.670877 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.766916 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:19 crc kubenswrapper[4839]: I0321 05:11:19.284341 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:19 crc kubenswrapper[4839]: I0321 05:11:19.466328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"cde0425763174457626a3683ed0e3ac17018ee383a5638dbacee306c171373be"} Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.479664 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" exitCode=0 Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.479913 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883"} Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.484522 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:11:21 crc kubenswrapper[4839]: I0321 05:11:21.490221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.501965 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" exitCode=0 Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.502061 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.502425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.532141 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fh65n" podStartSLOduration=3.1604547849999998 podStartE2EDuration="4.532110564s" podCreationTimestamp="2026-03-21 05:11:18 +0000 UTC" firstStartedPulling="2026-03-21 05:11:20.484161495 +0000 UTC m=+2884.811948171" lastFinishedPulling="2026-03-21 05:11:21.855817274 +0000 UTC m=+2886.183603950" observedRunningTime="2026-03-21 05:11:22.521211996 +0000 UTC m=+2886.848998682" watchObservedRunningTime="2026-03-21 05:11:22.532110564 +0000 UTC m=+2886.859897240" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.767318 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.767848 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.817285 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:29 crc kubenswrapper[4839]: I0321 05:11:29.764453 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:29 crc kubenswrapper[4839]: I0321 
05:11:29.811458 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:31 crc kubenswrapper[4839]: I0321 05:11:31.727738 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fh65n" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" containerID="cri-o://2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" gracePeriod=2 Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.183985 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208758 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208795 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.211306 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities" (OuterVolumeSpecName: 
"utilities") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.217114 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb" (OuterVolumeSpecName: "kube-api-access-frdgb") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "kube-api-access-frdgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.310880 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.310912 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738450 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" exitCode=0 Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738521 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" 
event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"cde0425763174457626a3683ed0e3ac17018ee383a5638dbacee306c171373be"} Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738541 4839 scope.go:117] "RemoveContainer" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738702 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.762893 4839 scope.go:117] "RemoveContainer" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.783905 4839 scope.go:117] "RemoveContainer" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.829455 4839 scope.go:117] "RemoveContainer" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830079 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": container with ID starting with 2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a not found: ID does not exist" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830129 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} err="failed to get container status \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": rpc error: code = NotFound desc = could not find container \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": 
container with ID starting with 2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a not found: ID does not exist" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830156 4839 scope.go:117] "RemoveContainer" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830540 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": container with ID starting with bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5 not found: ID does not exist" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830566 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} err="failed to get container status \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": rpc error: code = NotFound desc = could not find container \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": container with ID starting with bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5 not found: ID does not exist" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830592 4839 scope.go:117] "RemoveContainer" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830841 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": container with ID starting with 68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883 not found: ID does not exist" 
containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830884 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883"} err="failed to get container status \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": rpc error: code = NotFound desc = could not find container \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": container with ID starting with 68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883 not found: ID does not exist" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.166756 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.226952 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.540295 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.550561 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:34 crc kubenswrapper[4839]: I0321 05:11:34.464851 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" path="/var/lib/kubelet/pods/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e/volumes" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.153682 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154516 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154530 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154569 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154591 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154601 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154607 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154769 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.155461 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157341 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157378 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157737 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.161338 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.218439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.319785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsg7\" 
(UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.340287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.473868 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:01 crc kubenswrapper[4839]: I0321 05:12:01.182346 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:02 crc kubenswrapper[4839]: I0321 05:12:02.012638 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerStarted","Data":"c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499"} Mar 21 05:12:03 crc kubenswrapper[4839]: I0321 05:12:03.023955 4839 generic.go:334] "Generic (PLEG): container finished" podID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerID="3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785" exitCode=0 Mar 21 05:12:03 crc kubenswrapper[4839]: I0321 05:12:03.024056 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerDied","Data":"3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785"} Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.431675 4839 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.536183 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"82138c0f-eab6-4265-8db8-a8a1d934493a\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.542311 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7" (OuterVolumeSpecName: "kube-api-access-hdsg7") pod "82138c0f-eab6-4265-8db8-a8a1d934493a" (UID: "82138c0f-eab6-4265-8db8-a8a1d934493a"). InnerVolumeSpecName "kube-api-access-hdsg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.639200 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerDied","Data":"c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499"} Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042522 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042236 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.500881 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.510130 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:12:06 crc kubenswrapper[4839]: I0321 05:12:06.465289 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" path="/var/lib/kubelet/pods/ba8ac9dd-e3e7-4e21-9286-731d926d9580/volumes" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.309765 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:16 crc kubenswrapper[4839]: E0321 05:12:16.310854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.310875 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.311094 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.313383 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.323376 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.499387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.500413 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.500506 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603790 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603960 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.605714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.607043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.639434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.937778 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:19 crc kubenswrapper[4839]: I0321 05:12:19.422861 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110202 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" exitCode=0 Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d"} Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110490 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"8e022772352c8be147feb12372aa8cb33c79e9b947e4b88f3cb5a88ac672c9cf"} Mar 21 05:12:21 crc kubenswrapper[4839]: I0321 05:12:21.120765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} Mar 21 05:12:22 crc kubenswrapper[4839]: I0321 05:12:22.136854 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" exitCode=0 Mar 21 05:12:22 crc kubenswrapper[4839]: I0321 05:12:22.137169 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" 
event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} Mar 21 05:12:23 crc kubenswrapper[4839]: I0321 05:12:23.150851 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} Mar 21 05:12:23 crc kubenswrapper[4839]: I0321 05:12:23.176919 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ffqlt" podStartSLOduration=4.495910051 podStartE2EDuration="7.176897296s" podCreationTimestamp="2026-03-21 05:12:16 +0000 UTC" firstStartedPulling="2026-03-21 05:12:20.11194645 +0000 UTC m=+2944.439733126" lastFinishedPulling="2026-03-21 05:12:22.792933695 +0000 UTC m=+2947.120720371" observedRunningTime="2026-03-21 05:12:23.171188245 +0000 UTC m=+2947.498974931" watchObservedRunningTime="2026-03-21 05:12:23.176897296 +0000 UTC m=+2947.504683972" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.938551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.940787 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.994283 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:27 crc kubenswrapper[4839]: I0321 05:12:27.246973 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:27 crc kubenswrapper[4839]: I0321 05:12:27.305798 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.212906 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ffqlt" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" containerID="cri-o://52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" gracePeriod=2 Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.812805 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.913238 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914184 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914341 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities" (OuterVolumeSpecName: "utilities") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: 
"2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.915060 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.919699 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9" (OuterVolumeSpecName: "kube-api-access-nqcr9") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: "2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "kube-api-access-nqcr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.943272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: "2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.016498 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.016536 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226685 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" exitCode=0 Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226738 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226749 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226772 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"8e022772352c8be147feb12372aa8cb33c79e9b947e4b88f3cb5a88ac672c9cf"} Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226796 4839 scope.go:117] "RemoveContainer" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.254656 4839 scope.go:117] "RemoveContainer" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.264368 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.279430 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.288101 4839 scope.go:117] "RemoveContainer" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322283 4839 scope.go:117] "RemoveContainer" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 05:12:30.322839 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": container with ID starting with 52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264 not found: ID does not exist" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322884 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} err="failed to get container status \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": rpc error: code = NotFound desc = could not find container \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": container with ID starting with 52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264 not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322916 4839 scope.go:117] "RemoveContainer" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 05:12:30.326896 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": container with ID starting with 4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46 not found: ID does not exist" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.326937 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} err="failed to get container status \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": rpc error: code = NotFound desc = could not find container \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": container with ID starting with 4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46 not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.326964 4839 scope.go:117] "RemoveContainer" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 
05:12:30.327303 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": container with ID starting with 70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d not found: ID does not exist" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.327325 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d"} err="failed to get container status \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": rpc error: code = NotFound desc = could not find container \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": container with ID starting with 70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.466486 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" path="/var/lib/kubelet/pods/2caa3218-47ca-4a13-aa31-bc551be3b478/volumes" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.980126 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.980388 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.396668 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397904 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-content" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397920 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-content" Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397943 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397948 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397977 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-utilities" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397983 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-utilities" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.398147 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.399465 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.417438 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670309 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772557 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772653 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772828 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.773369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.773410 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.795474 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.883104 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:42 crc kubenswrapper[4839]: I0321 05:12:42.441397 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.351975 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e" exitCode=0 Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.352071 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e"} Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.352547 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerStarted","Data":"ff9c7d348116e9c8ef37ba10f189bce01fe1fe3bb594e044bacc7cb9f6670753"} Mar 21 05:12:44 crc kubenswrapper[4839]: I0321 05:12:44.363812 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb" exitCode=0 Mar 21 05:12:44 crc kubenswrapper[4839]: I0321 05:12:44.363895 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb"} Mar 21 05:12:46 crc kubenswrapper[4839]: I0321 05:12:46.703623 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" 
event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerStarted","Data":"3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210"} Mar 21 05:12:46 crc kubenswrapper[4839]: I0321 05:12:46.730441 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9cfwq" podStartSLOduration=3.3541314509999998 podStartE2EDuration="5.730419505s" podCreationTimestamp="2026-03-21 05:12:41 +0000 UTC" firstStartedPulling="2026-03-21 05:12:43.353695405 +0000 UTC m=+2967.681482081" lastFinishedPulling="2026-03-21 05:12:45.729983459 +0000 UTC m=+2970.057770135" observedRunningTime="2026-03-21 05:12:46.726632578 +0000 UTC m=+2971.054419264" watchObservedRunningTime="2026-03-21 05:12:46.730419505 +0000 UTC m=+2971.058206181" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.883950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.884657 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.928120 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:52 crc kubenswrapper[4839]: I0321 05:12:52.801221 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:52 crc kubenswrapper[4839]: I0321 05:12:52.872462 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:54 crc kubenswrapper[4839]: I0321 05:12:54.763725 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9cfwq" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" 
containerID="cri-o://3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" gracePeriod=2 Mar 21 05:12:55 crc kubenswrapper[4839]: I0321 05:12:55.776202 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" exitCode=0 Mar 21 05:12:55 crc kubenswrapper[4839]: I0321 05:12:55.776215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210"} Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.098373 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267223 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267380 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267838 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 
05:12:56.268121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities" (OuterVolumeSpecName: "utilities") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.268442 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.274169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg" (OuterVolumeSpecName: "kube-api-access-dhqvg") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "kube-api-access-dhqvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.326220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.387541 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.387606 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.487694 4839 scope.go:117] "RemoveContainer" containerID="1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.789003 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"ff9c7d348116e9c8ef37ba10f189bce01fe1fe3bb594e044bacc7cb9f6670753"} Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.790469 4839 scope.go:117] "RemoveContainer" containerID="3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.789389 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.815876 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.819839 4839 scope.go:117] "RemoveContainer" containerID="dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.825388 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.840617 4839 scope.go:117] "RemoveContainer" containerID="0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e" Mar 21 05:12:58 crc kubenswrapper[4839]: I0321 05:12:58.462740 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" path="/var/lib/kubelet/pods/759f8f2c-b554-4a39-80d1-ec067ebec86f/volumes" Mar 21 05:13:00 crc kubenswrapper[4839]: I0321 05:13:00.980554 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:00 crc kubenswrapper[4839]: I0321 05:13:00.981297 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.979809 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.980489 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.980559 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.981542 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.981657 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" gracePeriod=600 Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137229 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" exitCode=0 Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137303 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:13:32 crc kubenswrapper[4839]: I0321 05:13:32.155735 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.143687 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144587 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144606 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144625 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144632 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144653 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144660 4839 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144885 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.145471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.148074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.148299 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.149059 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.156306 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.292673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.395675 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: 
\"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.432261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.465675 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.923945 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: W0321 05:14:00.925875 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacfde7eb_12d0_4baf_9958_3ff93b290071.slice/crio-f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4 WatchSource:0}: Error finding container f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4: Status 404 returned error can't find the container with id f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4 Mar 21 05:14:01 crc kubenswrapper[4839]: I0321 05:14:01.428578 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerStarted","Data":"f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4"} Mar 21 05:14:02 crc kubenswrapper[4839]: I0321 05:14:02.441008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" 
event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerStarted","Data":"93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a"} Mar 21 05:14:02 crc kubenswrapper[4839]: I0321 05:14:02.457162 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" podStartSLOduration=1.525210737 podStartE2EDuration="2.457145029s" podCreationTimestamp="2026-03-21 05:14:00 +0000 UTC" firstStartedPulling="2026-03-21 05:14:00.92818075 +0000 UTC m=+3045.255967426" lastFinishedPulling="2026-03-21 05:14:01.860115032 +0000 UTC m=+3046.187901718" observedRunningTime="2026-03-21 05:14:02.453514216 +0000 UTC m=+3046.781300892" watchObservedRunningTime="2026-03-21 05:14:02.457145029 +0000 UTC m=+3046.784931705" Mar 21 05:14:03 crc kubenswrapper[4839]: I0321 05:14:03.450665 4839 generic.go:334] "Generic (PLEG): container finished" podID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerID="93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a" exitCode=0 Mar 21 05:14:03 crc kubenswrapper[4839]: I0321 05:14:03.450713 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerDied","Data":"93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a"} Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.831923 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.873512 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"acfde7eb-12d0-4baf-9958-3ff93b290071\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.897387 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb" (OuterVolumeSpecName: "kube-api-access-g4tqb") pod "acfde7eb-12d0-4baf-9958-3ff93b290071" (UID: "acfde7eb-12d0-4baf-9958-3ff93b290071"). InnerVolumeSpecName "kube-api-access-g4tqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.975373 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469448 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerDied","Data":"f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4"} Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469808 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469805 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.520650 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.532957 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:14:06 crc kubenswrapper[4839]: I0321 05:14:06.463747 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" path="/var/lib/kubelet/pods/e550e6e6-fc33-4703-b8db-6cd8169ebc7f/volumes" Mar 21 05:14:56 crc kubenswrapper[4839]: I0321 05:14:56.620553 4839 scope.go:117] "RemoveContainer" containerID="4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.142680 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: E0321 05:15:00.143603 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.143618 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.143825 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.144436 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.150454 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.150503 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.159370 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.253963 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.254077 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.254150 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356150 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356249 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356353 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.357438 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.362222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.375006 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.469041 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.915052 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: W0321 05:15:00.925216 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54425e5_2050_4510_be6a_ef16c4311765.slice/crio-9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45 WatchSource:0}: Error finding container 9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45: Status 404 returned error can't find the container with id 9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45 Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.949580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerStarted","Data":"9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45"} Mar 21 05:15:01 crc 
kubenswrapper[4839]: I0321 05:15:01.958196 4839 generic.go:334] "Generic (PLEG): container finished" podID="e54425e5-2050-4510-be6a-ef16c4311765" containerID="2048e73b844bad4a2409cb352a14732d0691192b6583598f901a01e2393f2a26" exitCode=0 Mar 21 05:15:01 crc kubenswrapper[4839]: I0321 05:15:01.958256 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerDied","Data":"2048e73b844bad4a2409cb352a14732d0691192b6583598f901a01e2393f2a26"} Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.339259 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.415818 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.416190 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.416438 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.418418 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume" (OuterVolumeSpecName: "config-volume") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.427258 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f" (OuterVolumeSpecName: "kube-api-access-xt46f") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "kube-api-access-xt46f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.435767 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519074 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519083 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975826 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerDied","Data":"9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45"} Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975868 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975925 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.430478 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.443198 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.461969 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" path="/var/lib/kubelet/pods/0fd65835-5c51-49a6-8e2f-9ac9569c2c64/volumes" Mar 21 05:15:56 crc kubenswrapper[4839]: I0321 05:15:56.686817 4839 scope.go:117] "RemoveContainer" containerID="d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.158010 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:00 crc kubenswrapper[4839]: E0321 05:16:00.159351 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.159380 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.159708 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.160832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163319 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163464 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163535 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.180609 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.276033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.378326 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.409821 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " 
pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.479738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.979902 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.980460 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:16:01 crc kubenswrapper[4839]: I0321 05:16:01.017210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:01 crc kubenswrapper[4839]: I0321 05:16:01.512172 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerStarted","Data":"74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87"} Mar 21 05:16:03 crc kubenswrapper[4839]: I0321 05:16:03.529849 4839 generic.go:334] "Generic (PLEG): container finished" podID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerID="7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32" exitCode=0 Mar 21 05:16:03 crc kubenswrapper[4839]: I0321 05:16:03.529915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" 
event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerDied","Data":"7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32"} Mar 21 05:16:04 crc kubenswrapper[4839]: I0321 05:16:04.936539 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.075201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"bcf873d9-ae04-40eb-b855-cca2a045773c\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.080808 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv" (OuterVolumeSpecName: "kube-api-access-q42dv") pod "bcf873d9-ae04-40eb-b855-cca2a045773c" (UID: "bcf873d9-ae04-40eb-b855-cca2a045773c"). InnerVolumeSpecName "kube-api-access-q42dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.177912 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549645 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerDied","Data":"74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87"} Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549985 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549910 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.016088 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.027074 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.464464 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" path="/var/lib/kubelet/pods/aec2b9da-f24b-47bb-95cc-2903624e2eb1/volumes" Mar 21 05:16:30 crc kubenswrapper[4839]: I0321 05:16:30.980987 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:16:30 crc kubenswrapper[4839]: I0321 05:16:30.982070 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:16:56 crc kubenswrapper[4839]: I0321 05:16:56.747368 4839 scope.go:117] "RemoveContainer" containerID="3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.979954 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.980330 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.980390 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.981357 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.981447 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" gracePeriod=600 Mar 21 05:17:01 crc kubenswrapper[4839]: E0321 05:17:01.146453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088407 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" exitCode=0 Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088506 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088804 4839 scope.go:117] "RemoveContainer" containerID="b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.089608 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:02 crc kubenswrapper[4839]: E0321 05:17:02.089890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:14 crc kubenswrapper[4839]: I0321 05:17:14.452739 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:14 crc kubenswrapper[4839]: E0321 05:17:14.453590 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:25 crc kubenswrapper[4839]: I0321 05:17:25.452831 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:25 crc kubenswrapper[4839]: E0321 05:17:25.453611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:36 crc kubenswrapper[4839]: I0321 05:17:36.463677 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:36 crc kubenswrapper[4839]: E0321 05:17:36.464801 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:47 crc kubenswrapper[4839]: I0321 05:17:47.453493 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:47 crc kubenswrapper[4839]: E0321 05:17:47.454354 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.185364 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: E0321 05:18:00.186280 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.186294 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.186541 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.187201 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189082 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189293 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189424 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.203361 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.280803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.382976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.402836 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " 
pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.518722 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.959480 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.975974 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:18:01 crc kubenswrapper[4839]: I0321 05:18:01.453496 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:01 crc kubenswrapper[4839]: E0321 05:18:01.454032 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:01 crc kubenswrapper[4839]: I0321 05:18:01.643704 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerStarted","Data":"96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4"} Mar 21 05:18:03 crc kubenswrapper[4839]: I0321 05:18:03.661626 4839 generic.go:334] "Generic (PLEG): container finished" podID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerID="aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523" exitCode=0 Mar 21 05:18:03 crc kubenswrapper[4839]: I0321 05:18:03.661708 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerDied","Data":"aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523"} Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.107189 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.181429 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.186811 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8" (OuterVolumeSpecName: "kube-api-access-c4lx8") pod "a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" (UID: "a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a"). InnerVolumeSpecName "kube-api-access-c4lx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.283855 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679770 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerDied","Data":"96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4"} Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679810 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679833 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.176096 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.185336 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.463409 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" path="/var/lib/kubelet/pods/82138c0f-eab6-4265-8db8-a8a1d934493a/volumes" Mar 21 05:18:15 crc kubenswrapper[4839]: I0321 05:18:15.453135 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:15 crc kubenswrapper[4839]: E0321 05:18:15.453852 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:29 crc kubenswrapper[4839]: I0321 05:18:29.453950 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:29 crc kubenswrapper[4839]: E0321 05:18:29.455188 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:43 crc kubenswrapper[4839]: I0321 05:18:43.452515 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:43 crc kubenswrapper[4839]: E0321 05:18:43.453475 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:54 crc kubenswrapper[4839]: I0321 05:18:54.453784 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:54 crc kubenswrapper[4839]: E0321 05:18:54.455364 4839 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:56 crc kubenswrapper[4839]: I0321 05:18:56.838260 4839 scope.go:117] "RemoveContainer" containerID="3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785" Mar 21 05:19:06 crc kubenswrapper[4839]: I0321 05:19:06.458383 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:06 crc kubenswrapper[4839]: E0321 05:19:06.459200 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:20 crc kubenswrapper[4839]: I0321 05:19:20.452867 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:20 crc kubenswrapper[4839]: E0321 05:19:20.453683 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:34 crc kubenswrapper[4839]: I0321 05:19:34.453528 4839 scope.go:117] 
"RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:34 crc kubenswrapper[4839]: E0321 05:19:34.454849 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.438103 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:19:45 crc kubenswrapper[4839]: E0321 05:19:45.439122 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.439139 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.439335 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.440719 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.467460 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580008 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580249 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580313 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682216 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682546 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682610 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.709276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.763092 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.303430 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.751454 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerID="36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254" exitCode=0 Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.751510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254"} Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.752600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerStarted","Data":"e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172"} Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.454105 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:48 crc kubenswrapper[4839]: E0321 05:19:48.455659 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.772153 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff" 
containerID="0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801" exitCode=0 Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.772229 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801"} Mar 21 05:19:50 crc kubenswrapper[4839]: I0321 05:19:50.823652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerStarted","Data":"0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc"} Mar 21 05:19:50 crc kubenswrapper[4839]: I0321 05:19:50.895221 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4vch" podStartSLOduration=3.034622317 podStartE2EDuration="5.895195173s" podCreationTimestamp="2026-03-21 05:19:45 +0000 UTC" firstStartedPulling="2026-03-21 05:19:46.753144875 +0000 UTC m=+3391.080931551" lastFinishedPulling="2026-03-21 05:19:49.613717731 +0000 UTC m=+3393.941504407" observedRunningTime="2026-03-21 05:19:50.841266061 +0000 UTC m=+3395.169052737" watchObservedRunningTime="2026-03-21 05:19:50.895195173 +0000 UTC m=+3395.222981849" Mar 21 05:19:55 crc kubenswrapper[4839]: I0321 05:19:55.763604 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:55 crc kubenswrapper[4839]: I0321 05:19:55.764447 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:56 crc kubenswrapper[4839]: I0321 05:19:56.825424 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" 
probeResult="failure" output=< Mar 21 05:19:56 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:19:56 crc kubenswrapper[4839]: > Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.145135 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"] Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.146918 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.149976 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.150495 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.150770 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.160860 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"] Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.272905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.375064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID: 
\"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.397893 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.467023 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.935072 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"] Mar 21 05:20:01 crc kubenswrapper[4839]: I0321 05:20:01.925237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerStarted","Data":"9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779"} Mar 21 05:20:02 crc kubenswrapper[4839]: I0321 05:20:02.934852 4839 generic.go:334] "Generic (PLEG): container finished" podID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerID="229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb" exitCode=0 Mar 21 05:20:02 crc kubenswrapper[4839]: I0321 05:20:02.935032 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerDied","Data":"229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb"} Mar 21 05:20:03 crc kubenswrapper[4839]: I0321 05:20:03.452905 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:20:03 crc kubenswrapper[4839]: 
E0321 05:20:03.453226 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.328643 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.469701 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.476235 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb" (OuterVolumeSpecName: "kube-api-access-whlvb") pod "b89d49dc-a7f5-4a24-98c5-818fe0e99ded" (UID: "b89d49dc-a7f5-4a24-98c5-818fe0e99ded"). InnerVolumeSpecName "kube-api-access-whlvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.572759 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.953840 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerDied","Data":"9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779"} Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.954027 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779" Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.953939 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96" Mar 21 05:20:05 crc kubenswrapper[4839]: I0321 05:20:05.437428 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:20:05 crc kubenswrapper[4839]: I0321 05:20:05.448509 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:20:06 crc kubenswrapper[4839]: I0321 05:20:06.471097 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" path="/var/lib/kubelet/pods/acfde7eb-12d0-4baf-9958-3ff93b290071/volumes" Mar 21 05:20:06 crc kubenswrapper[4839]: I0321 05:20:06.817439 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" probeResult="failure" output=< Mar 21 05:20:06 crc kubenswrapper[4839]: 
timeout: failed to connect service ":50051" within 1s Mar 21 05:20:06 crc kubenswrapper[4839]: > Mar 21 05:20:14 crc kubenswrapper[4839]: I0321 05:20:14.453636 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:20:14 crc kubenswrapper[4839]: E0321 05:20:14.454318 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:20:16 crc kubenswrapper[4839]: I0321 05:20:16.904379 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" probeResult="failure" output=< Mar 21 05:20:16 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:20:16 crc kubenswrapper[4839]: > Mar 21 05:20:25 crc kubenswrapper[4839]: I0321 05:20:25.805039 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:20:25 crc kubenswrapper[4839]: I0321 05:20:25.854448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:20:26 crc kubenswrapper[4839]: I0321 05:20:26.042310 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:20:27 crc kubenswrapper[4839]: I0321 05:20:27.164081 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" 
containerID="cri-o://0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc" gracePeriod=2 Mar 21 05:20:27 crc kubenswrapper[4839]: I0321 05:20:27.453103 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:20:27 crc kubenswrapper[4839]: E0321 05:20:27.453381 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.177965 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerID="0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc" exitCode=0 Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc"} Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172"} Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178513 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.196910 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.254785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255188 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities" (OuterVolumeSpecName: "utilities") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255822 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.262942 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5" (OuterVolumeSpecName: "kube-api-access-h5vn5") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "kube-api-access-h5vn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.357905 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.398307 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.459783 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.188050 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.224307 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.241954 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:20:30 crc kubenswrapper[4839]: I0321 05:20:30.464216 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" path="/var/lib/kubelet/pods/f723037e-e37c-4441-b840-6a3da3ec2fff/volumes" Mar 21 05:20:39 crc kubenswrapper[4839]: I0321 05:20:39.452992 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:20:39 crc kubenswrapper[4839]: E0321 05:20:39.453858 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:20:50 crc kubenswrapper[4839]: I0321 05:20:50.454135 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:20:50 crc kubenswrapper[4839]: E0321 05:20:50.455764 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" 
podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:20:56 crc kubenswrapper[4839]: I0321 05:20:56.952435 4839 scope.go:117] "RemoveContainer" containerID="93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a" Mar 21 05:21:03 crc kubenswrapper[4839]: I0321 05:21:03.456858 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:21:03 crc kubenswrapper[4839]: E0321 05:21:03.457866 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:21:15 crc kubenswrapper[4839]: I0321 05:21:15.453603 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:21:15 crc kubenswrapper[4839]: E0321 05:21:15.454907 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:21:26 crc kubenswrapper[4839]: I0321 05:21:26.463162 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:21:26 crc kubenswrapper[4839]: E0321 05:21:26.465387 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.131972 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rgwlh"] Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133169 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-utilities" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133185 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-utilities" Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133212 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133220 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133234 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133241 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc" Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133255 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-content" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133261 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-content" Mar 21 
05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133464 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133477 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.135158 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.148117 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"] Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.251732 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.251980 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.252023 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh" 
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.354927 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.355022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.355192 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.356251 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.356260 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.384414 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.467223 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.069802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.833500 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712" exitCode=0
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.833616 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"}
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.834336 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"02d803b24cd1bb5adbe92926bb02820c11a8c095dc0491408ccacdabb5366603"}
Mar 21 05:21:36 crc kubenswrapper[4839]: I0321 05:21:36.845472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"}
Mar 21 05:21:37 crc kubenswrapper[4839]: I0321 05:21:37.860544 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d" exitCode=0
Mar 21 05:21:37 crc kubenswrapper[4839]: I0321 05:21:37.860732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"}
Mar 21 05:21:38 crc kubenswrapper[4839]: I0321 05:21:38.453591 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:21:38 crc kubenswrapper[4839]: E0321 05:21:38.453878 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:21:38 crc kubenswrapper[4839]: I0321 05:21:38.876415 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"}
Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.469168 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.469780 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.527983 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.554561 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rgwlh" podStartSLOduration=8.065752499 podStartE2EDuration="10.554533744s" podCreationTimestamp="2026-03-21 05:21:34 +0000 UTC" firstStartedPulling="2026-03-21 05:21:35.835522589 +0000 UTC m=+3500.163309265" lastFinishedPulling="2026-03-21 05:21:38.324303834 +0000 UTC m=+3502.652090510" observedRunningTime="2026-03-21 05:21:38.898604756 +0000 UTC m=+3503.226391432" watchObservedRunningTime="2026-03-21 05:21:44.554533744 +0000 UTC m=+3508.882320420"
Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.971752 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:45 crc kubenswrapper[4839]: I0321 05:21:45.021097 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:46 crc kubenswrapper[4839]: I0321 05:21:46.940688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rgwlh" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server" containerID="cri-o://a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" gracePeriod=2
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.441760 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636490 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") "
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636642 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") "
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") "
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.637700 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities" (OuterVolumeSpecName: "utilities") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.643718 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd" (OuterVolumeSpecName: "kube-api-access-stfwd") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "kube-api-access-stfwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.738666 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.738702 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") on node \"crc\" DevicePath \"\""
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953006 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" exitCode=0
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"}
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953145 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953181 4839 scope.go:117] "RemoveContainer" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953163 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"02d803b24cd1bb5adbe92926bb02820c11a8c095dc0491408ccacdabb5366603"}
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.972381 4839 scope.go:117] "RemoveContainer" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"
Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.996465 4839 scope.go:117] "RemoveContainer" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039150 4839 scope.go:117] "RemoveContainer" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"
Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.039687 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": container with ID starting with a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9 not found: ID does not exist" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039730 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"} err="failed to get container status \"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": rpc error: code = NotFound desc = could not find container \"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": container with ID starting with a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9 not found: ID does not exist"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039751 4839 scope.go:117] "RemoveContainer" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"
Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.040334 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": container with ID starting with 4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d not found: ID does not exist" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.040440 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"} err="failed to get container status \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": rpc error: code = NotFound desc = could not find container \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": container with ID starting with 4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d not found: ID does not exist"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.040739 4839 scope.go:117] "RemoveContainer" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"
Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.041303 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": container with ID starting with e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712 not found: ID does not exist" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.041337 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"} err="failed to get container status \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": rpc error: code = NotFound desc = could not find container \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": container with ID starting with e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712 not found: ID does not exist"
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.546166 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.555112 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.591776 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.600162 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:50 crc kubenswrapper[4839]: I0321 05:21:50.465058 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" path="/var/lib/kubelet/pods/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5/volumes"
Mar 21 05:21:53 crc kubenswrapper[4839]: I0321 05:21:53.453448 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:21:53 crc kubenswrapper[4839]: E0321 05:21:53.454373 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.142223 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"]
Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143307 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-utilities"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143362 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-utilities"
Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143410 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143417 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server"
Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143433 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-content"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143439 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-content"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143658 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.144397 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.146533 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.146930 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.147224 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.153401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"]
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.183153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.284804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.315065 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.480162 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.981289 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"]
Mar 21 05:22:01 crc kubenswrapper[4839]: I0321 05:22:01.078014 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerStarted","Data":"674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979"}
Mar 21 05:22:03 crc kubenswrapper[4839]: I0321 05:22:03.101887 4839 generic.go:334] "Generic (PLEG): container finished" podID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerID="eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3" exitCode=0
Mar 21 05:22:03 crc kubenswrapper[4839]: I0321 05:22:03.102116 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerDied","Data":"eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3"}
Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.570991 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.672672 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") "
Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.694758 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj" (OuterVolumeSpecName: "kube-api-access-bsphj") pod "98d91ef7-84b2-40fa-b268-b3a42085ecbd" (UID: "98d91ef7-84b2-40fa-b268-b3a42085ecbd"). InnerVolumeSpecName "kube-api-access-bsphj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.776458 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125002 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerDied","Data":"674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979"}
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125536 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979"
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125111 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh"
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.452961 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.660163 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"]
Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.674205 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"]
Mar 21 05:22:06 crc kubenswrapper[4839]: I0321 05:22:06.138506 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"}
Mar 21 05:22:06 crc kubenswrapper[4839]: I0321 05:22:06.464902 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" path="/var/lib/kubelet/pods/bcf873d9-ae04-40eb-b855-cca2a045773c/volumes"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.530384 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:24 crc kubenswrapper[4839]: E0321 05:22:24.531544 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.531581 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.531884 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.534557 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.548776 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701923 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803489 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803554 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.804368 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.804505 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.827419 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.873521 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:25 crc kubenswrapper[4839]: I0321 05:22:25.359840 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:25 crc kubenswrapper[4839]: I0321 05:22:25.583434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"}
Mar 21 05:22:26 crc kubenswrapper[4839]: I0321 05:22:26.593937 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4" exitCode=0
Mar 21 05:22:26 crc kubenswrapper[4839]: I0321 05:22:26.594016 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4"}
Mar 21 05:22:28 crc kubenswrapper[4839]: I0321 05:22:28.615561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8"}
Mar 21 05:22:29 crc kubenswrapper[4839]: I0321 05:22:29.629477 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8" exitCode=0
Mar 21 05:22:29 crc kubenswrapper[4839]: I0321 05:22:29.629721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8"}
Mar 21 05:22:31 crc kubenswrapper[4839]: I0321 05:22:31.657952 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d"}
Mar 21 05:22:31 crc kubenswrapper[4839]: I0321 05:22:31.685754 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wzl7t" podStartSLOduration=3.84304255 podStartE2EDuration="7.685730332s" podCreationTimestamp="2026-03-21 05:22:24 +0000 UTC" firstStartedPulling="2026-03-21 05:22:26.59679398 +0000 UTC m=+3550.924580656" lastFinishedPulling="2026-03-21 05:22:30.439481762 +0000 UTC m=+3554.767268438" observedRunningTime="2026-03-21 05:22:31.678667383 +0000 UTC m=+3556.006454069" watchObservedRunningTime="2026-03-21 05:22:31.685730332 +0000 UTC m=+3556.013517008"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.875158 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.875871 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.929141 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:35 crc kubenswrapper[4839]: I0321 05:22:35.757763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:35 crc kubenswrapper[4839]: I0321 05:22:35.821705 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:37 crc kubenswrapper[4839]: I0321 05:22:37.728313 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wzl7t" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server" containerID="cri-o://65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" gracePeriod=2
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.741975 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" exitCode=0
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d"}
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"}
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742389 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.777168 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901864 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901966 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.903352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities" (OuterVolumeSpecName: "utilities") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.910501 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg" (OuterVolumeSpecName: "kube-api-access-mmdmg") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "kube-api-access-mmdmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.928183 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004604 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004649 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004658 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.751250 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.793492 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"] Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.809000 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"] Mar 21 05:22:40 crc kubenswrapper[4839]: I0321 05:22:40.464311 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" path="/var/lib/kubelet/pods/ff36c9ee-5581-48b3-be29-a6d5ad4b9476/volumes" Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.245551 4839 scope.go:117] "RemoveContainer" containerID="7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32" Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.917520 4839 generic.go:334] "Generic (PLEG): container finished" podID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerID="0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b" exitCode=0 Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.917675 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerDied","Data":"0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b"} Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.329164 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435808 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435984 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436091 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436211 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436235 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436271 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436305 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.437257 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.437007 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data" (OuterVolumeSpecName: "config-data") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.438070 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.438100 4839 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.442918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.444222 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.448845 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5" (OuterVolumeSpecName: "kube-api-access-x84s5") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "kube-api-access-x84s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.474974 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.480297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.480460 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.502637 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545093 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545140 4839 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545149 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545158 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545177 4839 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545186 4839 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.627480 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.647015 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.945994 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerDied","Data":"de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6"} Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.946097 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6" Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.946058 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.553909 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554850 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-content" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554865 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-content" Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554888 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-utilities" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554896 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-utilities" Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554929 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554955 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554962 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555144 4839 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555162 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555851 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.559676 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6v5zd" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.580338 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.743241 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qk2\" (UniqueName: \"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.743932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.845487 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qk2\" (UniqueName: 
\"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.845680 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.846141 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.871273 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qk2\" (UniqueName: \"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.871940 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:09 
crc kubenswrapper[4839]: I0321 05:23:09.938535 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 21 05:23:10 crc kubenswrapper[4839]: I0321 05:23:10.429228 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 21 05:23:10 crc kubenswrapper[4839]: I0321 05:23:10.435599 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:23:11 crc kubenswrapper[4839]: I0321 05:23:11.041072 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8","Type":"ContainerStarted","Data":"f84da512d0b4ee4769cf676e54c429ce52b3bd443ddce23391de8b5f03054507"} Mar 21 05:23:15 crc kubenswrapper[4839]: I0321 05:23:15.087835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8","Type":"ContainerStarted","Data":"4bcb391614db0784c3a6e55fa5bf5ea8e33f5a28e61318dca439f8c0bd5726b2"} Mar 21 05:23:15 crc kubenswrapper[4839]: I0321 05:23:15.108341 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.6273181340000002 podStartE2EDuration="6.108317887s" podCreationTimestamp="2026-03-21 05:23:09 +0000 UTC" firstStartedPulling="2026-03-21 05:23:10.435335246 +0000 UTC m=+3594.763121922" lastFinishedPulling="2026-03-21 05:23:13.916334999 +0000 UTC m=+3598.244121675" observedRunningTime="2026-03-21 05:23:15.098912042 +0000 UTC m=+3599.426698718" watchObservedRunningTime="2026-03-21 05:23:15.108317887 +0000 UTC m=+3599.436104563" Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.731425 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.749200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.749314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.939487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.940312 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.940391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: 
\"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042724 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.043225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.043465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.065368 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " 
pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.077083 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.685828 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.772115 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"15a447ccc083b3575f68f14b0abb62b3593891a818abc131c80bd34fc5738f62"} Mar 21 05:23:42 crc kubenswrapper[4839]: I0321 05:23:42.787081 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" exitCode=0 Mar 21 05:23:42 crc kubenswrapper[4839]: I0321 05:23:42.787142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1"} Mar 21 05:23:43 crc kubenswrapper[4839]: I0321 05:23:43.801188 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"} Mar 21 05:23:44 crc kubenswrapper[4839]: I0321 05:23:44.813941 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" exitCode=0 Mar 21 05:23:44 crc kubenswrapper[4839]: I0321 05:23:44.814061 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"} Mar 21 05:23:46 crc kubenswrapper[4839]: I0321 05:23:46.841922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"} Mar 21 05:23:46 crc kubenswrapper[4839]: I0321 05:23:46.868856 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9br9q" podStartSLOduration=3.139453393 podStartE2EDuration="6.868830878s" podCreationTimestamp="2026-03-21 05:23:40 +0000 UTC" firstStartedPulling="2026-03-21 05:23:42.78933967 +0000 UTC m=+3627.117126346" lastFinishedPulling="2026-03-21 05:23:46.518717155 +0000 UTC m=+3630.846503831" observedRunningTime="2026-03-21 05:23:46.861648006 +0000 UTC m=+3631.189434682" watchObservedRunningTime="2026-03-21 05:23:46.868830878 +0000 UTC m=+3631.196617544" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.077684 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.078446 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.129850 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.574958 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.576907 4839 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.579763 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-72qrq"/"default-dockercfg-b7xd9" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.588594 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72qrq"/"openshift-service-ca.crt" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.588888 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72qrq"/"kube-root-ca.crt" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.609617 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.731191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.731883 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.833794 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: 
\"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.833916 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.834465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.852483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.904419 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.934823 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.008457 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.403435 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.895010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"d623f9d6f3bcfead5a92b85d22dc41afd83ca6b9def71de722479fb2dbc37fbf"} Mar 21 05:23:53 crc kubenswrapper[4839]: I0321 05:23:53.903580 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9br9q" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" containerID="cri-o://b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" gracePeriod=2 Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.396097 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499202 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499276 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.501548 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities" (OuterVolumeSpecName: "utilities") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.507534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr" (OuterVolumeSpecName: "kube-api-access-6qxxr") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "kube-api-access-6qxxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.574230 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602080 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602124 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602139 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914897 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" exitCode=0 Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914953 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"} Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914981 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"15a447ccc083b3575f68f14b0abb62b3593891a818abc131c80bd34fc5738f62"} Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914999 4839 scope.go:117] "RemoveContainer" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.915009 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.951451 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.959377 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.465499 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" path="/var/lib/kubelet/pods/aaf4c7b6-9115-45ff-b52d-3e232223ae62/volumes" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.622821 4839 scope.go:117] "RemoveContainer" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.692882 4839 scope.go:117] "RemoveContainer" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.720908 4839 scope.go:117] "RemoveContainer" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.721900 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": container with ID 
starting with b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141 not found: ID does not exist" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.721951 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"} err="failed to get container status \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": rpc error: code = NotFound desc = could not find container \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": container with ID starting with b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141 not found: ID does not exist" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.721984 4839 scope.go:117] "RemoveContainer" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.722395 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": container with ID starting with 20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884 not found: ID does not exist" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722435 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"} err="failed to get container status \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": rpc error: code = NotFound desc = could not find container \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": container with ID starting with 20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884 not found: 
ID does not exist" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722455 4839 scope.go:117] "RemoveContainer" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.722786 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": container with ID starting with e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1 not found: ID does not exist" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722842 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1"} err="failed to get container status \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": rpc error: code = NotFound desc = could not find container \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": container with ID starting with e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1 not found: ID does not exist" Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.952186 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459"} Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.953125 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a"} Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.999705 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/must-gather-mxjpl" podStartSLOduration=2.686641435 podStartE2EDuration="6.999681656s" podCreationTimestamp="2026-03-21 05:23:51 +0000 UTC" firstStartedPulling="2026-03-21 05:23:52.410342956 +0000 UTC m=+3636.738129632" lastFinishedPulling="2026-03-21 05:23:56.723383177 +0000 UTC m=+3641.051169853" observedRunningTime="2026-03-21 05:23:57.991157825 +0000 UTC m=+3642.318944531" watchObservedRunningTime="2026-03-21 05:23:57.999681656 +0000 UTC m=+3642.327468332" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.171445 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172251 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172270 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172287 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172294 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172328 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172335 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172488 4839 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.173251 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.176762 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.177123 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.178550 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.194459 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.330691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.433819 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.467141 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.499494 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.097809 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.448486 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.449859 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.560383 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.560526 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662537 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.698764 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.794156 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: W0321 05:24:01.837160 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a565de2_9452_4c7a_85c5_8fe6f15f6859.slice/crio-e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144 WatchSource:0}: Error finding container e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144: Status 404 returned error can't find the container with id e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144 Mar 21 05:24:02 crc kubenswrapper[4839]: I0321 05:24:02.003075 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerStarted","Data":"3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a"} Mar 21 05:24:02 crc kubenswrapper[4839]: I0321 05:24:02.005146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerStarted","Data":"e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144"} Mar 21 05:24:04 crc kubenswrapper[4839]: I0321 05:24:04.024762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerStarted","Data":"cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c"} Mar 21 05:24:05 crc kubenswrapper[4839]: I0321 05:24:05.037603 4839 generic.go:334] "Generic (PLEG): container finished" podID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerID="cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c" exitCode=0 Mar 21 05:24:05 crc kubenswrapper[4839]: I0321 05:24:05.037678 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerDied","Data":"cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c"} Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.620707 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.702530 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"ad5b8a95-e16d-42a8-9069-5294c8934559\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.722500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp" (OuterVolumeSpecName: "kube-api-access-65fhp") pod "ad5b8a95-e16d-42a8-9069-5294c8934559" (UID: "ad5b8a95-e16d-42a8-9069-5294c8934559"). InnerVolumeSpecName "kube-api-access-65fhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.804979 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061675 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerDied","Data":"3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a"} Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061728 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061730 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.710442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.724697 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:24:08 crc kubenswrapper[4839]: I0321 05:24:08.463638 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" path="/var/lib/kubelet/pods/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a/volumes" Mar 21 05:24:17 crc kubenswrapper[4839]: I0321 05:24:17.472985 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerStarted","Data":"1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1"} Mar 
21 05:24:17 crc kubenswrapper[4839]: I0321 05:24:17.493784 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/crc-debug-lb22p" podStartSLOduration=1.14990143 podStartE2EDuration="16.493757323s" podCreationTimestamp="2026-03-21 05:24:01 +0000 UTC" firstStartedPulling="2026-03-21 05:24:01.839411356 +0000 UTC m=+3646.167198032" lastFinishedPulling="2026-03-21 05:24:17.183267259 +0000 UTC m=+3661.511053925" observedRunningTime="2026-03-21 05:24:17.48689598 +0000 UTC m=+3661.814682656" watchObservedRunningTime="2026-03-21 05:24:17.493757323 +0000 UTC m=+3661.821544009" Mar 21 05:24:30 crc kubenswrapper[4839]: I0321 05:24:30.980165 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:24:30 crc kubenswrapper[4839]: I0321 05:24:30.981126 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:24:57 crc kubenswrapper[4839]: I0321 05:24:57.411355 4839 scope.go:117] "RemoveContainer" containerID="aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523" Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.930469 4839 generic.go:334] "Generic (PLEG): container finished" podID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerID="1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1" exitCode=0 Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.930588 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" 
event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerDied","Data":"1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1"} Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.981017 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.981076 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.054599 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.092237 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.106833 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227270 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227373 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host" (OuterVolumeSpecName: "host") pod "8a565de2-9452-4c7a-85c5-8fe6f15f6859" (UID: "8a565de2-9452-4c7a-85c5-8fe6f15f6859"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227589 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.228148 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.240427 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn" (OuterVolumeSpecName: "kube-api-access-4g9vn") pod "8a565de2-9452-4c7a-85c5-8fe6f15f6859" (UID: "8a565de2-9452-4c7a-85c5-8fe6f15f6859"). InnerVolumeSpecName "kube-api-access-4g9vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.329661 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.465107 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" path="/var/lib/kubelet/pods/8a565de2-9452-4c7a-85c5-8fe6f15f6859/volumes" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.948993 4839 scope.go:117] "RemoveContainer" containerID="1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.949136 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.266837 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:03 crc kubenswrapper[4839]: E0321 05:25:03.267794 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.267823 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: E0321 05:25:03.267866 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.267874 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268070 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268096 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268883 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.452905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.453219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.555977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.556219 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.556345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc 
kubenswrapper[4839]: I0321 05:25:03.580772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.587846 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.959650 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerStarted","Data":"e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d"} Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.959962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerStarted","Data":"1ef3a429fab53a62499b69a9b777966ca288c36bcf5ff402c6d44788234a6533"} Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.972694 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" podStartSLOduration=0.972675305 podStartE2EDuration="972.675305ms" podCreationTimestamp="2026-03-21 05:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:25:03.972253863 +0000 UTC m=+3708.300040559" watchObservedRunningTime="2026-03-21 05:25:03.972675305 +0000 UTC m=+3708.300461981" Mar 21 05:25:04 crc kubenswrapper[4839]: I0321 05:25:04.971436 4839 generic.go:334] "Generic (PLEG): container finished" podID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" 
containerID="e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d" exitCode=0 Mar 21 05:25:04 crc kubenswrapper[4839]: I0321 05:25:04.971490 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerDied","Data":"e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d"} Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.072791 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.106477 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.115129 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204333 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204457 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host" (OuterVolumeSpecName: "host") pod "c815b0ab-12b4-4c7e-a22c-5ae8680936c0" (UID: "c815b0ab-12b4-4c7e-a22c-5ae8680936c0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.206035 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.211925 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr" (OuterVolumeSpecName: "kube-api-access-84tzr") pod "c815b0ab-12b4-4c7e-a22c-5ae8680936c0" (UID: "c815b0ab-12b4-4c7e-a22c-5ae8680936c0"). InnerVolumeSpecName "kube-api-access-84tzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.308587 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.464600 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" path="/var/lib/kubelet/pods/c815b0ab-12b4-4c7e-a22c-5ae8680936c0/volumes" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.997869 4839 scope.go:117] "RemoveContainer" containerID="e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.997985 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.285888 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:07 crc kubenswrapper[4839]: E0321 05:25:07.286742 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.286759 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.286966 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.289334 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.430076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.430180 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531384 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.551603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.608272 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009261 4839 generic.go:334] "Generic (PLEG): container finished" podID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerID="42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250" exitCode=0 Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" event={"ID":"dfc60151-2f6c-4842-b0e9-4194fa1ff596","Type":"ContainerDied","Data":"42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250"} Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" event={"ID":"dfc60151-2f6c-4842-b0e9-4194fa1ff596","Type":"ContainerStarted","Data":"d9e2cc211386c3fc3bc6977bd20cd9c80e6183dffa782cd8c68d6688c551308e"} Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.050453 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.059850 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.132801 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.261038 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.262053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.263543 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host" (OuterVolumeSpecName: "host") pod "dfc60151-2f6c-4842-b0e9-4194fa1ff596" (UID: "dfc60151-2f6c-4842-b0e9-4194fa1ff596"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.269897 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp" (OuterVolumeSpecName: "kube-api-access-9nckp") pod "dfc60151-2f6c-4842-b0e9-4194fa1ff596" (UID: "dfc60151-2f6c-4842-b0e9-4194fa1ff596"). InnerVolumeSpecName "kube-api-access-9nckp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.365917 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.365967 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.027883 4839 scope.go:117] "RemoveContainer" containerID="42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.027925 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.465692 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" path="/var/lib/kubelet/pods/dfc60151-2f6c-4842-b0e9-4194fa1ff596/volumes" Mar 21 05:25:25 crc kubenswrapper[4839]: I0321 05:25:25.826319 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.018222 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.033804 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.100937 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.250432 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.293087 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.504447 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-central-agent/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.632537 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-notification-agent/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.633182 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz_a1d76458-d587-4960-9bcc-7e3d3122b44d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.686240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/proxy-httpd/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.763409 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/sg-core/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.901749 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api/0.log" Mar 21 05:25:26 crc 
kubenswrapper[4839]: I0321 05:25:26.907055 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api-log/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.060124 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/cinder-scheduler/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.144394 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/probe/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.258239 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx_a58d82e4-2de9-4680-a08c-6eeb775ed08a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.466437 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.490351 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qkclf_ab9d4433-fe0e-471b-84f8-568b31920ed3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.704367 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.755550 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/dnsmasq-dns/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.832247 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt_7f875f01-020a-4cd6-950a-4dbb6ccb344e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.987113 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-httpd/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.017013 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-log/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.157561 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-log/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.194252 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-httpd/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.398126 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.664221 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7_268d87b5-57ec-49ff-be62-fe59e6b4b819/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.880538 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon-log/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.156900 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xdvx2_7538d496-3768-42b7-9f2e-70e1b44a9d6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.175698 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cb996784d-fvhvp_6a3fcdf0-3099-467b-928b-89a4876130fe/keystone-api/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.208781 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567821-rmctn_666be2f4-0416-4086-94d3-c48c82f380b2/keystone-cron/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.377806 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1626316f-b029-4424-b783-25eeb2790eb2/kube-state-metrics/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.924769 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-w48j6_2d056acb-0183-4157-a830-fff4cd1dcacf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.037174 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-httpd/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.058811 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-api/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.437148 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d_ceef8f42-5d77-44c1-ac39-edf0080f68e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.930682 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_152d0351-12d2-4cf1-ad49-fd943b223442/nova-cell0-conductor-conductor/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.979890 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.979965 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.980022 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.981479 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.981743 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" gracePeriod=600 Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.009711 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-log/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246021 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" exitCode=0 Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246105 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"} Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246219 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.258065 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3194b187-fe06-4eed-b725-995cef2b05a0/nova-cell1-conductor-conductor/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.354952 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-api/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.369509 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.770235 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-log/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.168069 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-metadata/0.log" 
Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.200693 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bbecccff-0ecc-44ff-a57b-f7289b8bcf5a/nova-scheduler-scheduler/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.250388 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hf42f_3f8728ca-30ff-41a9-8a48-e3bb7911bcc7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.259560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.306126 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.549718 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/galera/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.570236 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.588212 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.869678 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_52b9f7e1-d86c-457e-9391-eee855a9f7a7/openstackclient/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.887011 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/galera/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.910020 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.315835 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.327884 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mx5tf_64d13111-845e-4c61-a4ce-483ddfb799b7/openstack-network-exporter/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.603096 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.604760 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.655991 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovs-vswitchd/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.802057 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qt5s4_b31b64cb-0266-4b8a-9fcb-ae5e36c8309a/ovn-controller/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.000407 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v4wqq_7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 
05:25:34.055006 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/openstack-network-exporter/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.088590 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/ovn-northd/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.200424 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/openstack-network-exporter/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.343796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/ovsdbserver-nb/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.390618 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/openstack-network-exporter/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.527907 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/ovsdbserver-sb/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.674641 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-log/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.688907 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-api/0.log"
Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.838730 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.094750 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/rabbitmq/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.096048 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.102982 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.328050 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/rabbitmq/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.388842 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.402585 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r_66c3e343-3306-455d-89d7-db17c1bd53ed/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.571252 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pgfnn_a6dd2bff-543f-4ebb-b908-3e528f322548/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.710206 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq_acb0bb61-c53a-4171-bca5-4a3141d6904a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.888480 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55fzl_26adbd7b-7994-4bea-9f94-338881339833/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.911287 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-chfcw_39dbacec-c845-4f19-92a9-c0e63fba203c/ssh-known-hosts-edpm-deployment/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.210384 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-httpd/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.214798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-server/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.327257 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkvzq_5484abbf-53f2-445a-b6fe-0996eba95345/swift-ring-rebalance/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.423546 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-reaper/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.444955 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-auditor/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.559187 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-replicator/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.675304 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-server/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.696343 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-auditor/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.768187 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-replicator/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.828422 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-server/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.925265 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-updater/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.935412 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-auditor/0.log"
Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.949799 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-expirer/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.087334 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-replicator/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.165697 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-server/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.167628 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-updater/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.173716 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/rsync/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.328914 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/swift-recon-cron/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.549548 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3/tempest-tests-tempest-tests-runner/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.690506 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8/test-operator-logs-container/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.865204 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq_4f49b501-bec5-4fe1-89d7-ff3c217ba580/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.886730 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h_f9d60b3b-b1b4-4d98-9da2-e152ac410c81/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:25:46 crc kubenswrapper[4839]: I0321 05:25:46.986845 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3c49bdbb-0c05-4dea-8de8-61ca09b7e84c/memcached/0.log"
Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 05:25:57.483166 4839 scope.go:117] "RemoveContainer" containerID="0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc"
Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 05:25:57.506784 4839 scope.go:117] "RemoveContainer" containerID="0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801"
Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 05:25:57.539918 4839 scope.go:117] "RemoveContainer" containerID="36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.139097 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"]
Mar 21 05:26:00 crc kubenswrapper[4839]: E0321 05:26:00.139860 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.139876 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.140134 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.140842 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.143437 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.143755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.146173 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.150283 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"]
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.238145 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.339402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.366807 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.461108 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.933369 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"]
Mar 21 05:26:01 crc kubenswrapper[4839]: I0321 05:26:01.558747 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerStarted","Data":"8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e"}
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.223002 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-2mkmz_0c51ffa0-2285-4f7e-af09-0cafba139934/manager/0.log"
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.476775 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9s4vt_ee9d64a7-0d03-4cb0-a266-47b26f9957b5/manager/0.log"
Mar 21 05:26:02 crc kubenswrapper[4839]: E0321 05:26:02.569071 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274043bb_38cf_435f_9cb1_01d194d34325.slice/crio-conmon-3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.570033 4839 generic.go:334] "Generic (PLEG): container finished" podID="274043bb-38cf-435f-9cb1-01d194d34325" containerID="3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456" exitCode=0
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.570069 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerDied","Data":"3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456"}
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.704150 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log"
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.919109 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log"
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.932033 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log"
Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.955543 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.190573 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.247475 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.247884 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/extract/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.254228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dncxc_05f30a88-e899-4727-9440-981d010a1342/manager/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.474254 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2n27d_fd731e7e-440b-4e77-a778-08a4a62e0c9f/manager/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.511823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-6s6q7_d3dc722f-f66c-46a0-9b1a-ae1b9c4de060/manager/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.727249 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-d7h7r_acb1d7ac-b3f9-4564-8346-344ffb5c3964/manager/0.log"
Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.997291 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8sg4d_ccec0d11-294b-43a2-be2e-fcef8a6818c6/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.009845 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.084403 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-bsdjs_ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.119096 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"274043bb-38cf-435f-9cb1-01d194d34325\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") "
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.125984 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c" (OuterVolumeSpecName: "kube-api-access-gk44c") pod "274043bb-38cf-435f-9cb1-01d194d34325" (UID: "274043bb-38cf-435f-9cb1-01d194d34325"). InnerVolumeSpecName "kube-api-access-gk44c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.221096 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") on node \"crc\" DevicePath \"\""
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.282336 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gzh8j_6074766c-0ecd-4051-a676-dcc21b24184f/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.288712 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k4lg5_7a7bf7a3-acea-4059-8a89-db576f3588d1/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.533875 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-sp4j4_2162bafb-7e49-435c-9591-d8b725f10336/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.544510 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-94vpf_70702cd5-6815-4a01-98a4-2f4dfaeef839/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.595960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerDied","Data":"8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e"}
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.596017 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.596101 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.776834 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wjw9j_6914418f-3639-4ebc-a58d-d8b478cbf6b4/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.778351 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6p4mn_faac458b-73d9-4fb8-9f1c-50f7521088b0/manager/0.log"
Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.965974 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-8gc22_859b11bc-e9fb-40a2-a053-66a07337965c/manager/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.105901 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"]
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.140267 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"]
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.155292 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-948579bb7-j6fx6_b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59/operator/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.262240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lj8h4_6ff65f56-ff89-43c6-b087-6d3c3b72d2ef/registry-server/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.526624 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qt58c_379b40a1-e3f5-448b-b668-0f168457e5d0/manager/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.696415 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-x75fd_361c2d7b-9a75-41fd-953d-4b1bd64ca6df/manager/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.810491 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lzbtt_c8584ecb-dc92-4cec-9178-3017f09095da/operator/0.log"
Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.988429 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-xt7xt_2045f5d2-c67e-47cd-b16d-3c69d449f099/manager/0.log"
Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.258850 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-btkvt_d3ea9c2e-11a4-492e-9e84-8294e81ce775/manager/0.log"
Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.324390 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-7f4qh_5eeb53bd-3988-458f-baa5-d265e0178aea/manager/0.log"
Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.488634 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" path="/var/lib/kubelet/pods/b89d49dc-a7f5-4a24-98c5-818fe0e99ded/volumes"
Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.489103 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ccd4855ff-jx6pn_06f9e67e-8978-46a1-9dc8-c511197241e2/manager/0.log"
Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.573609 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-hh27s_1d32b541-7b80-492b-adac-e51d5090b668/manager/0.log"
Mar 21 05:26:24 crc kubenswrapper[4839]: I0321 05:26:24.980419 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-whlp9_40014780-8cb8-47fa-8b2c-c4fb7d04a85c/control-plane-machine-set-operator/0.log"
Mar 21 05:26:25 crc kubenswrapper[4839]: I0321 05:26:25.183856 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/kube-rbac-proxy/0.log"
Mar 21 05:26:25 crc kubenswrapper[4839]: I0321 05:26:25.185208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/machine-api-operator/0.log"
Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.496208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2cpt_daed7a16-7023-463e-9d60-3f56f091f73e/cert-manager-controller/0.log"
Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.688658 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v297k_814a91ac-5e2f-4479-88a3-254e4216e50c/cert-manager-cainjector/0.log"
Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.771790 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9zj6_d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f/cert-manager-webhook/0.log"
Mar 21 05:26:48 crc kubenswrapper[4839]: I0321 05:26:48.822912 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-j5z4g_8e7a66bb-3731-4f75-9a7f-5b9d07a36b39/nmstate-console-plugin/0.log"
Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.030770 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k57vv_42329e42-8b9b-45ed-ab04-bf12468d8859/nmstate-handler/0.log"
Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.112350 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/nmstate-metrics/0.log"
Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.138285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/kube-rbac-proxy/0.log"
Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.259022 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-vrlf4_fbd83ba5-ac43-45f6-8a15-78ba82a246f7/nmstate-operator/0.log"
Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.304867 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7ghd4_5a2485ca-cb21-4edf-b074-f7ac255f45f8/nmstate-webhook/0.log"
Mar 21 05:26:57 crc kubenswrapper[4839]: I0321 05:26:57.659516 4839 scope.go:117] "RemoveContainer" containerID="229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb"
Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.706658 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/kube-rbac-proxy/0.log"
Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.855823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/controller/0.log"
Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.939521 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.104935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.148773 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.154129 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.175869 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.376967 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.389622 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.419342 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.464993 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.587101 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.589892 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.609159 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.691706 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/controller/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.769918 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.799551 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr-metrics/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.910642 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy-frr/0.log"
Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.974953 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/reloader/0.log"
Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.139414 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qm7jb_06b3d06a-d515-469a-9a88-77b3f1e6c6f0/frr-k8s-webhook-server/0.log"
Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.345499 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8d865685-2pk4g_888cdc0b-241d-456a-9a9f-3ed253b3dbf3/manager/0.log"
Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.434900 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7df97b96d6-7wvzr_ca0627e2-8115-4514-ba93-47e00a823a31/webhook-server/0.log"
Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.641085 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/kube-rbac-proxy/0.log"
Mar 21 05:27:18 crc kubenswrapper[4839]: I0321 05:27:18.196856 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/speaker/0.log"
Mar 21 05:27:18 crc kubenswrapper[4839]: I0321 05:27:18.525293 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.132708 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.550692 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.551988 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.569874 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.755898 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/extract/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.765799 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.773485 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log"
Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.939915 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log"
Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.567296 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log"
Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.598277 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log"
Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.605802 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log"
Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.862228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log"
Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.862687 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/extract/0.log"
Mar 21
05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.931457 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.047889 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.211753 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.242582 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.276844 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.463228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.468470 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.700749 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.926735 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/registry-server/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.929241 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.992786 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.022156 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.109348 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.121945 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.365394 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qb9bp_df9bf95b-dc8f-4104-9c6c-873159393850/marketplace-operator/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.549193 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.794463 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.799829 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.815904 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.828575 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/registry-server/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.024892 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.025870 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.064287 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.286336 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.292420 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/registry-server/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.302699 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.325110 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.504288 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.509888 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:35 crc kubenswrapper[4839]: I0321 05:27:35.119402 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/registry-server/0.log" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.150538 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:00 crc kubenswrapper[4839]: E0321 05:28:00.151465 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.151477 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.151672 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.152261 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.154387 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.154878 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.155208 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.161740 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.201011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.306420 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.335388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsqsn\" (UniqueName: 
\"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.476199 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.980746 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.981117 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:28:01 crc kubenswrapper[4839]: I0321 05:28:01.102787 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:01 crc kubenswrapper[4839]: I0321 05:28:01.637249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerStarted","Data":"bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed"} Mar 21 05:28:02 crc kubenswrapper[4839]: I0321 05:28:02.647762 4839 generic.go:334] "Generic (PLEG): container finished" podID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerID="3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886" exitCode=0 Mar 21 05:28:02 crc kubenswrapper[4839]: I0321 05:28:02.647873 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerDied","Data":"3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886"} Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.117405 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.286269 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.293136 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn" (OuterVolumeSpecName: "kube-api-access-tsqsn") pod "c8214f95-33aa-486b-bb82-915b2c5b2cf6" (UID: "c8214f95-33aa-486b-bb82-915b2c5b2cf6"). InnerVolumeSpecName "kube-api-access-tsqsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.391851 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672154 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerDied","Data":"bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed"} Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672507 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672328 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:05 crc kubenswrapper[4839]: I0321 05:28:05.187307 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:28:05 crc kubenswrapper[4839]: I0321 05:28:05.196539 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:28:06 crc kubenswrapper[4839]: I0321 05:28:06.465034 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" path="/var/lib/kubelet/pods/98d91ef7-84b2-40fa-b268-b3a42085ecbd/volumes" Mar 21 05:28:30 crc kubenswrapper[4839]: I0321 05:28:30.980404 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:28:30 crc kubenswrapper[4839]: I0321 05:28:30.981071 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.755047 4839 scope.go:117] "RemoveContainer" containerID="fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.777989 4839 scope.go:117] "RemoveContainer" containerID="97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.824851 4839 scope.go:117] "RemoveContainer" containerID="eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.902512 4839 scope.go:117] "RemoveContainer" containerID="65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.980271 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.981258 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.981323 4839 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.982343 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.982402 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" gracePeriod=600 Mar 21 05:29:01 crc kubenswrapper[4839]: E0321 05:29:01.110068 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313145 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" exitCode=0 Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" 
event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313260 4839 scope.go:117] "RemoveContainer" containerID="26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.314114 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:01 crc kubenswrapper[4839]: E0321 05:29:01.314471 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:12 crc kubenswrapper[4839]: I0321 05:29:12.452630 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:12 crc kubenswrapper[4839]: E0321 05:29:12.453402 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:25 crc kubenswrapper[4839]: I0321 05:29:25.453068 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:25 crc kubenswrapper[4839]: E0321 05:29:25.453867 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.595410 4839 generic.go:334] "Generic (PLEG): container finished" podID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" exitCode=0 Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.595499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerDied","Data":"4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a"} Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.596823 4839 scope.go:117] "RemoveContainer" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.927811 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/gather/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.278876 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.279671 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-72qrq/must-gather-mxjpl" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" containerID="cri-o://0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" gracePeriod=2 Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.291004 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.682795 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.683264 4839 generic.go:334] "Generic (PLEG): container finished" podID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerID="0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" exitCode=143 Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.829195 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.830715 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.936650 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.936780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.943186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42" (OuterVolumeSpecName: "kube-api-access-t4s42") pod "de78e0a8-6c32-44ae-8f44-443eb0f1dd25" 
(UID: "de78e0a8-6c32-44ae-8f44-443eb0f1dd25"). InnerVolumeSpecName "kube-api-access-t4s42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.039342 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.093993 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "de78e0a8-6c32-44ae-8f44-443eb0f1dd25" (UID: "de78e0a8-6c32-44ae-8f44-443eb0f1dd25"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.141228 4839 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.453264 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:38 crc kubenswrapper[4839]: E0321 05:29:38.455082 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.465634 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" path="/var/lib/kubelet/pods/de78e0a8-6c32-44ae-8f44-443eb0f1dd25/volumes" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.694416 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.695217 4839 scope.go:117] "RemoveContainer" containerID="0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.695329 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.716615 4839 scope.go:117] "RemoveContainer" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" Mar 21 05:29:50 crc kubenswrapper[4839]: I0321 05:29:50.453830 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:50 crc kubenswrapper[4839]: E0321 05:29:50.454843 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.156809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158017 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158038 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158058 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158066 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158100 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158109 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158345 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158363 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158385 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.159152 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.162964 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.163370 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.163592 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.166753 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.254361 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.255509 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.257718 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.257991 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.271715 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.285477 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.388203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.388351 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 
05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.390208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.390376 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.420188 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.486208 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.492765 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.492872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.493037 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.494401 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.499779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.515225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.588900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.926617 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.937201 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.036760 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:01 crc kubenswrapper[4839]: W0321 05:30:01.043191 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5068c229_fbb7_489b_909b_767dd8db6c26.slice/crio-5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6 WatchSource:0}: Error finding container 5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6: Status 404 returned error can't find the container with id 5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6 Mar 21 05:30:01 crc kubenswrapper[4839]: 
I0321 05:30:01.904959 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerStarted","Data":"795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.905259 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerStarted","Data":"5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.907402 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerStarted","Data":"6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.923551 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" podStartSLOduration=1.9235247 podStartE2EDuration="1.9235247s" podCreationTimestamp="2026-03-21 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:30:01.918241971 +0000 UTC m=+4006.246028647" watchObservedRunningTime="2026-03-21 05:30:01.9235247 +0000 UTC m=+4006.251311376" Mar 21 05:30:02 crc kubenswrapper[4839]: I0321 05:30:02.923728 4839 generic.go:334] "Generic (PLEG): container finished" podID="5068c229-fbb7-489b-909b-767dd8db6c26" containerID="795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66" exitCode=0 Mar 21 05:30:02 crc kubenswrapper[4839]: I0321 05:30:02.924074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerDied","Data":"795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66"} Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.452981 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:03 crc kubenswrapper[4839]: E0321 05:30:03.453474 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.933370 4839 generic.go:334] "Generic (PLEG): container finished" podID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerID="32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a" exitCode=0 Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.933452 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerDied","Data":"32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a"} Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.273926 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372004 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372373 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372485 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.373174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume" (OuterVolumeSpecName: "config-volume") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.378270 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.385352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7" (OuterVolumeSpecName: "kube-api-access-bfdj7") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "kube-api-access-bfdj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477828 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477873 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477889 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942530 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942515 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerDied","Data":"5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6"} Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942998 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.995793 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.004703 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.288745 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.299658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.307006 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl" (OuterVolumeSpecName: "kube-api-access-hkbxl") pod "5ff27433-bc42-4edf-bcac-48ffe5e0680a" (UID: "5ff27433-bc42-4edf-bcac-48ffe5e0680a"). 
InnerVolumeSpecName "kube-api-access-hkbxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.402171 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.954750 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerDied","Data":"6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307"} Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.955168 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.954939 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.355155 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.363362 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.464221 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" path="/var/lib/kubelet/pods/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26/volumes" Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.464953 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" path="/var/lib/kubelet/pods/ad5b8a95-e16d-42a8-9069-5294c8934559/volumes" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.989364 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:13 crc kubenswrapper[4839]: E0321 05:30:13.990333 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990346 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: E0321 05:30:13.990361 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990369 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990528 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990551 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.991815 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.002540 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068466 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068912 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.171990 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172217 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172290 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172551 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.193629 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8sz\" (UniqueName: 
\"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.316051 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.824549 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:15 crc kubenswrapper[4839]: I0321 05:30:15.028885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b"} Mar 21 05:30:15 crc kubenswrapper[4839]: I0321 05:30:15.028939 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"b0864809c14aa0a411344168780ec1f23fecd6a5710ca66679f834831778488d"} Mar 21 05:30:16 crc kubenswrapper[4839]: I0321 05:30:16.038930 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b" exitCode=0 Mar 21 05:30:16 crc kubenswrapper[4839]: I0321 05:30:16.039018 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b"} Mar 21 05:30:17 crc kubenswrapper[4839]: I0321 05:30:17.049433 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" 
event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67"} Mar 21 05:30:17 crc kubenswrapper[4839]: I0321 05:30:17.452648 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:17 crc kubenswrapper[4839]: E0321 05:30:17.453306 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:18 crc kubenswrapper[4839]: I0321 05:30:18.064068 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67"} Mar 21 05:30:18 crc kubenswrapper[4839]: I0321 05:30:18.063827 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67" exitCode=0 Mar 21 05:30:19 crc kubenswrapper[4839]: I0321 05:30:19.077088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3"} Mar 21 05:30:19 crc kubenswrapper[4839]: I0321 05:30:19.100591 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csd6j" podStartSLOduration=3.518213369 podStartE2EDuration="6.100553326s" 
podCreationTimestamp="2026-03-21 05:30:13 +0000 UTC" firstStartedPulling="2026-03-21 05:30:16.040875798 +0000 UTC m=+4020.368662474" lastFinishedPulling="2026-03-21 05:30:18.623215755 +0000 UTC m=+4022.951002431" observedRunningTime="2026-03-21 05:30:19.097223872 +0000 UTC m=+4023.425010548" watchObservedRunningTime="2026-03-21 05:30:19.100553326 +0000 UTC m=+4023.428340002" Mar 21 05:30:24 crc kubenswrapper[4839]: I0321 05:30:24.317027 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:24 crc kubenswrapper[4839]: I0321 05:30:24.317530 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:25 crc kubenswrapper[4839]: I0321 05:30:25.364062 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-csd6j" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" probeResult="failure" output=< Mar 21 05:30:25 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:30:25 crc kubenswrapper[4839]: > Mar 21 05:30:28 crc kubenswrapper[4839]: I0321 05:30:28.452699 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:28 crc kubenswrapper[4839]: E0321 05:30:28.453519 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.379271 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-csd6j"
Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.439246 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csd6j"
Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.618148 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"]
Mar 21 05:30:36 crc kubenswrapper[4839]: I0321 05:30:36.238961 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csd6j" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" containerID="cri-o://7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3" gracePeriod=2
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.254069 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3" exitCode=0
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.254698 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3"}
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.374340 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j"
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") "
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") "
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") "
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.445733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities" (OuterVolumeSpecName: "utilities") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.453088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz" (OuterVolumeSpecName: "kube-api-access-bx8sz") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "kube-api-access-bx8sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.546803 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.546861 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") on node \"crc\" DevicePath \"\""
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.593523 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.648383 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.275212 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"b0864809c14aa0a411344168780ec1f23fecd6a5710ca66679f834831778488d"}
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.276288 4839 scope.go:117] "RemoveContainer" containerID="7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3"
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.276238 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j"
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.296878 4839 scope.go:117] "RemoveContainer" containerID="29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67"
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.315227 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"]
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.338291 4839 scope.go:117] "RemoveContainer" containerID="11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b"
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.346106 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"]
Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.463619 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" path="/var/lib/kubelet/pods/a52535f1-c597-4bcd-9cdf-b51230e45194/volumes"
Mar 21 05:30:41 crc kubenswrapper[4839]: I0321 05:30:41.452838 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:30:41 crc kubenswrapper[4839]: E0321 05:30:41.453719 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:30:54 crc kubenswrapper[4839]: I0321 05:30:54.453301 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:30:54 crc kubenswrapper[4839]: E0321 05:30:54.454481 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:30:58 crc kubenswrapper[4839]: I0321 05:30:58.029102 4839 scope.go:117] "RemoveContainer" containerID="77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1"
Mar 21 05:30:58 crc kubenswrapper[4839]: I0321 05:30:58.065495 4839 scope.go:117] "RemoveContainer" containerID="cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c"
Mar 21 05:31:08 crc kubenswrapper[4839]: I0321 05:31:08.453909 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:31:08 crc kubenswrapper[4839]: E0321 05:31:08.454819 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:31:22 crc kubenswrapper[4839]: I0321 05:31:22.454740 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:31:22 crc kubenswrapper[4839]: E0321 05:31:22.456040 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:31:33 crc kubenswrapper[4839]: I0321 05:31:33.453310 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:31:33 crc kubenswrapper[4839]: E0321 05:31:33.454179 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:31:45 crc kubenswrapper[4839]: I0321 05:31:45.453171 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:31:45 crc kubenswrapper[4839]: E0321 05:31:45.454035 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.131647 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136230 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136282 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server"
Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136331 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-utilities"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136345 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-utilities"
Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136374 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-content"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136389 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-content"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.137098 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.139051 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.144077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261422 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261614 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261649 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363595 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.364403 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.364464 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.386098 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.475992 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.005966 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932225 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8" exitCode=0
Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"}
Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932839 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerStarted","Data":"6d21bf3853d86faa6fa835635204d84060fc0472abf70b72de3c10254b58b0e9"}
Mar 21 05:31:51 crc kubenswrapper[4839]: I0321 05:31:51.954728 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8" exitCode=0
Mar 21 05:31:51 crc kubenswrapper[4839]: I0321 05:31:51.954769 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"}
Mar 21 05:31:52 crc kubenswrapper[4839]: I0321 05:31:52.966754 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerStarted","Data":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"}
Mar 21 05:31:52 crc kubenswrapper[4839]: I0321 05:31:52.992612 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4gg7" podStartSLOduration=2.600862572 podStartE2EDuration="4.992588859s" podCreationTimestamp="2026-03-21 05:31:48 +0000 UTC" firstStartedPulling="2026-03-21 05:31:49.941510284 +0000 UTC m=+4114.269296960" lastFinishedPulling="2026-03-21 05:31:52.333236571 +0000 UTC m=+4116.661023247" observedRunningTime="2026-03-21 05:31:52.988332079 +0000 UTC m=+4117.316118755" watchObservedRunningTime="2026-03-21 05:31:52.992588859 +0000 UTC m=+4117.320375535"
Mar 21 05:31:56 crc kubenswrapper[4839]: I0321 05:31:56.464215 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:31:56 crc kubenswrapper[4839]: E0321 05:31:56.465078 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.476625 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.476934 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.535705 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:59 crc kubenswrapper[4839]: I0321 05:31:59.076726 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:31:59 crc kubenswrapper[4839]: I0321 05:31:59.126295 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.151148 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"]
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.152354 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.154516 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.154817 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.158366 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.167276 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"]
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.294673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.397059 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.416910 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.492861 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.959650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"]
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.053183 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerStarted","Data":"d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917"}
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.053336 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4gg7" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" containerID="cri-o://e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" gracePeriod=2
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.518501 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") "
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") "
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625854 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") "
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.627038 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities" (OuterVolumeSpecName: "utilities") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.636970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c" (OuterVolumeSpecName: "kube-api-access-xwf7c") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "kube-api-access-xwf7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.687859 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728286 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728324 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728336 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068763 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" exitCode=0
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"}
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"6d21bf3853d86faa6fa835635204d84060fc0472abf70b72de3c10254b58b0e9"}
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068939 4839 scope.go:117] "RemoveContainer" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068877 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.096887 4839 scope.go:117] "RemoveContainer" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.128995 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.136160 4839 scope.go:117] "RemoveContainer" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.165000 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"]
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.199944 4839 scope.go:117] "RemoveContainer" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"
Mar 21 05:32:02 crc kubenswrapper[4839]: E0321 05:32:02.201285 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": container with ID starting with e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2 not found: ID does not exist" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.201391 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"} err="failed to get container status \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": rpc error: code = NotFound desc = could not find container \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": container with ID starting with e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2 not found: ID does not exist"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.201432 4839 scope.go:117] "RemoveContainer" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"
Mar 21 05:32:02 crc kubenswrapper[4839]: E0321 05:32:02.202116 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": container with ID starting with 82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8 not found: ID does not exist" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.202149 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"} err="failed to get container status \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": rpc error: code = NotFound desc = could not find container \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": container with ID starting with 82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8 not found: ID does not exist"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.202164 4839 scope.go:117] "RemoveContainer" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"
Mar 21 05:32:02 crc kubenswrapper[4839]: E0321 05:32:02.203917 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": container with ID starting with 0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8 not found: ID does not exist" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.203943 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"} err="failed to get container status \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": rpc error: code = NotFound desc = could not find container \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": container with ID starting with 0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8 not found: ID does not exist"
Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.462880 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" path="/var/lib/kubelet/pods/06965c4c-a775-46ef-ac7b-9638ec75c419/volumes"
Mar 21 05:32:03 crc kubenswrapper[4839]: I0321 05:32:03.079184 4839 generic.go:334] "Generic (PLEG): container finished" podID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerID="cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba" exitCode=0
Mar 21 05:32:03 crc kubenswrapper[4839]: I0321 05:32:03.080368 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerDied","Data":"cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba"}
Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.436495 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.508304 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") "
Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.515793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld" (OuterVolumeSpecName: "kube-api-access-p56ld") pod "d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" (UID: "d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1"). InnerVolumeSpecName "kube-api-access-p56ld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.611054 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.105950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerDied","Data":"d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917"}
Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.106004 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv"
Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.107849 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917"
Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.498574 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"]
Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.507848 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"]
Mar 21 05:32:06 crc kubenswrapper[4839]: I0321 05:32:06.466275 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274043bb-38cf-435f-9cb1-01d194d34325" path="/var/lib/kubelet/pods/274043bb-38cf-435f-9cb1-01d194d34325/volumes"
Mar 21 05:32:07 crc kubenswrapper[4839]: I0321 05:32:07.452665 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:32:07 crc kubenswrapper[4839]: E0321 05:32:07.453304 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:32:20 crc kubenswrapper[4839]: I0321 05:32:20.453156 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:32:20 crc kubenswrapper[4839]: E0321 05:32:20.453830 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:32:34 crc kubenswrapper[4839]: I0321 05:32:34.453387 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:32:34 crc kubenswrapper[4839]: E0321 05:32:34.454217 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.062104 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"]
Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063146 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-content"
Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063163 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-content"
Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063184 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server"
Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063192 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server"
Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063206 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063232 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-utilities" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063239 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-utilities" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063499 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063521 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.064730 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.066603 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ppvlf"/"openshift-service-ca.crt" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.066811 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ppvlf"/"kube-root-ca.crt" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.067042 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ppvlf"/"default-dockercfg-xwmcj" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.069886 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.123691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.123951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.225496 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " 
pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.225549 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.226094 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.248105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.383355 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.946434 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:32:42 crc kubenswrapper[4839]: I0321 05:32:42.442985 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"} Mar 21 05:32:42 crc kubenswrapper[4839]: I0321 05:32:42.443241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"5a207bcd98fea8bdcbd8fcac34144924b54bfe6689991ce19989ac8cd6f7c3fd"} Mar 21 05:32:43 crc kubenswrapper[4839]: I0321 05:32:43.453233 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"} Mar 21 05:32:43 crc kubenswrapper[4839]: I0321 05:32:43.472007 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" podStartSLOduration=2.471988495 podStartE2EDuration="2.471988495s" podCreationTimestamp="2026-03-21 05:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:32:43.470334828 +0000 UTC m=+4167.798121504" watchObservedRunningTime="2026-03-21 05:32:43.471988495 +0000 UTC m=+4167.799775171" Mar 21 05:32:45 crc kubenswrapper[4839]: I0321 05:32:45.962899 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:32:45 crc kubenswrapper[4839]: 
I0321 05:32:45.964454 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.032504 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.032657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133509 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133696 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") 
" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.166397 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.289515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.462362 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:32:46 crc kubenswrapper[4839]: E0321 05:32:46.462650 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.485330 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerStarted","Data":"1430886b299af015100e7533b1000c0e59127cd85c1994b254190e65fcf9e655"} Mar 21 05:32:47 crc kubenswrapper[4839]: I0321 05:32:47.498643 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerStarted","Data":"e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805"} Mar 21 05:32:47 crc kubenswrapper[4839]: I0321 
05:32:47.527293 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" podStartSLOduration=2.527275901 podStartE2EDuration="2.527275901s" podCreationTimestamp="2026-03-21 05:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:32:47.523917806 +0000 UTC m=+4171.851704482" watchObservedRunningTime="2026-03-21 05:32:47.527275901 +0000 UTC m=+4171.855062577" Mar 21 05:32:58 crc kubenswrapper[4839]: I0321 05:32:58.209411 4839 scope.go:117] "RemoveContainer" containerID="3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456" Mar 21 05:33:00 crc kubenswrapper[4839]: I0321 05:33:00.453445 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:00 crc kubenswrapper[4839]: E0321 05:33:00.453984 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:11 crc kubenswrapper[4839]: I0321 05:33:11.452969 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:11 crc kubenswrapper[4839]: E0321 05:33:11.454013 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:21 crc kubenswrapper[4839]: I0321 05:33:21.782103 4839 generic.go:334] "Generic (PLEG): container finished" podID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerID="e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805" exitCode=0 Mar 21 05:33:21 crc kubenswrapper[4839]: I0321 05:33:21.782179 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerDied","Data":"e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805"} Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.453279 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:22 crc kubenswrapper[4839]: E0321 05:33:22.453750 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.888902 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.924819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.934514 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057041 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"81dc8803-7188-45df-8f25-fe1037bbdd01\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057351 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"81dc8803-7188-45df-8f25-fe1037bbdd01\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057362 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host" (OuterVolumeSpecName: "host") pod "81dc8803-7188-45df-8f25-fe1037bbdd01" (UID: "81dc8803-7188-45df-8f25-fe1037bbdd01"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.058051 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.069256 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g" (OuterVolumeSpecName: "kube-api-access-95t7g") pod "81dc8803-7188-45df-8f25-fe1037bbdd01" (UID: "81dc8803-7188-45df-8f25-fe1037bbdd01"). InnerVolumeSpecName "kube-api-access-95t7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.159987 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.800540 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1430886b299af015100e7533b1000c0e59127cd85c1994b254190e65fcf9e655" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.800614 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.122485 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:24 crc kubenswrapper[4839]: E0321 05:33:24.123005 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.123028 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.123305 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.124090 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.278507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.278580 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.380867 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: 
\"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.381100 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.381227 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.403469 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.441265 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.466162 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" path="/var/lib/kubelet/pods/81dc8803-7188-45df-8f25-fe1037bbdd01/volumes" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.810540 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerStarted","Data":"d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55"} Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.811181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerStarted","Data":"55701753f649ce649fcb45fa7eda4471b54715543775e82e35dbbd0d3456ffd0"} Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.825723 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" podStartSLOduration=0.825704086 podStartE2EDuration="825.704086ms" podCreationTimestamp="2026-03-21 05:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:33:24.824136052 +0000 UTC m=+4209.151922738" watchObservedRunningTime="2026-03-21 05:33:24.825704086 +0000 UTC m=+4209.153490762" Mar 21 05:33:25 crc kubenswrapper[4839]: I0321 05:33:25.830171 4839 generic.go:334] "Generic (PLEG): container finished" podID="c9def19c-1036-49fc-874e-aa1013e5c547" containerID="d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55" exitCode=0 Mar 21 05:33:25 crc kubenswrapper[4839]: I0321 05:33:25.830524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" 
event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerDied","Data":"d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55"} Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.938841 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.972938 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.982430 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"c9def19c-1036-49fc-874e-aa1013e5c547\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026868 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"c9def19c-1036-49fc-874e-aa1013e5c547\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026990 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host" (OuterVolumeSpecName: "host") pod "c9def19c-1036-49fc-874e-aa1013e5c547" (UID: "c9def19c-1036-49fc-874e-aa1013e5c547"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.027386 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.032627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k" (OuterVolumeSpecName: "kube-api-access-jmn5k") pod "c9def19c-1036-49fc-874e-aa1013e5c547" (UID: "c9def19c-1036-49fc-874e-aa1013e5c547"). InnerVolumeSpecName "kube-api-access-jmn5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.128995 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.846084 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55701753f649ce649fcb45fa7eda4471b54715543775e82e35dbbd0d3456ffd0" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.846156 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.181145 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"]
Mar 21 05:33:28 crc kubenswrapper[4839]: E0321 05:33:28.181767 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.181786 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.182023 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.182793 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.350860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.350986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453164 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453303 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.464915 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" path="/var/lib/kubelet/pods/c9def19c-1036-49fc-874e-aa1013e5c547/volumes"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.475326 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.500947 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:28 crc kubenswrapper[4839]: W0321 05:33:28.526238 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a65d87c_e91f_4d4d_846d_16c7699da843.slice/crio-f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e WatchSource:0}: Error finding container f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e: Status 404 returned error can't find the container with id f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856520 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerID="1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3" exitCode=0
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856605 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" event={"ID":"9a65d87c-e91f-4d4d-846d-16c7699da843","Type":"ContainerDied","Data":"1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3"}
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" event={"ID":"9a65d87c-e91f-4d4d-846d-16c7699da843","Type":"ContainerStarted","Data":"f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e"}
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.902758 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"]
Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.911884 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"]
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.238200 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:29 crc kubenswrapper[4839]: E0321 05:33:29.239526 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.239624 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.239921 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.241937 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.250975 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368933 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.470804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.470897 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471119 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471849 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471944 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.492543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.559634 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.058057 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.182505 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"9a65d87c-e91f-4d4d-846d-16c7699da843\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") "
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.182763 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"9a65d87c-e91f-4d4d-846d-16c7699da843\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") "
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.183685 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host" (OuterVolumeSpecName: "host") pod "9a65d87c-e91f-4d4d-846d-16c7699da843" (UID: "9a65d87c-e91f-4d4d-846d-16c7699da843"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.188292 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4" (OuterVolumeSpecName: "kube-api-access-drbw4") pod "9a65d87c-e91f-4d4d-846d-16c7699da843" (UID: "9a65d87c-e91f-4d4d-846d-16c7699da843"). InnerVolumeSpecName "kube-api-access-drbw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.284834 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") on node \"crc\" DevicePath \"\""
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.285976 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") on node \"crc\" DevicePath \"\""
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.411304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:30 crc kubenswrapper[4839]: W0321 05:33:30.416685 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319559c6_c34d_4b00_ba90_8fcd5b5ff425.slice/crio-bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8 WatchSource:0}: Error finding container bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8: Status 404 returned error can't find the container with id bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.468451 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" path="/var/lib/kubelet/pods/9a65d87c-e91f-4d4d-846d-16c7699da843/volumes"
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.875103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc"
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.875126 4839 scope.go:117] "RemoveContainer" containerID="1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3"
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877433 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4" exitCode=0
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877465 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"}
Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerStarted","Data":"bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8"}
Mar 21 05:33:32 crc kubenswrapper[4839]: I0321 05:33:32.897768 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39" exitCode=0
Mar 21 05:33:32 crc kubenswrapper[4839]: I0321 05:33:32.897844 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"}
Mar 21 05:33:33 crc kubenswrapper[4839]: I0321 05:33:33.452993 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:33:33 crc kubenswrapper[4839]: E0321 05:33:33.453313 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:33:33 crc kubenswrapper[4839]: I0321 05:33:33.909098 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerStarted","Data":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"}
Mar 21 05:33:33 crc kubenswrapper[4839]: I0321 05:33:33.932451 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fdn9" podStartSLOduration=2.499160751 podStartE2EDuration="4.932424921s" podCreationTimestamp="2026-03-21 05:33:29 +0000 UTC" firstStartedPulling="2026-03-21 05:33:30.879248497 +0000 UTC m=+4215.207035173" lastFinishedPulling="2026-03-21 05:33:33.312512657 +0000 UTC m=+4217.640299343" observedRunningTime="2026-03-21 05:33:33.92669451 +0000 UTC m=+4218.254481206" watchObservedRunningTime="2026-03-21 05:33:33.932424921 +0000 UTC m=+4218.260211597"
Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.560741 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.561402 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.923358 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.998490 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:40 crc kubenswrapper[4839]: I0321 05:33:40.170177 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:41 crc kubenswrapper[4839]: I0321 05:33:41.973206 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fdn9" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server" containerID="cri-o://e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" gracePeriod=2
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.463559 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548623 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") "
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548723 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") "
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548774 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") "
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.549627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities" (OuterVolumeSpecName: "utilities") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.565793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m" (OuterVolumeSpecName: "kube-api-access-s225m") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "kube-api-access-s225m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.583624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651272 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651323 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651338 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") on node \"crc\" DevicePath \"\""
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985919 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" exitCode=0
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985961 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"}
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985989 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8"}
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.986007 4839 scope.go:117] "RemoveContainer" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"
Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.986161 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.006913 4839 scope.go:117] "RemoveContainer" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.033699 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.041744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"]
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.049717 4839 scope.go:117] "RemoveContainer" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097079 4839 scope.go:117] "RemoveContainer" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"
Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 05:33:43.097885 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": container with ID starting with e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0 not found: ID does not exist" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097928 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"} err="failed to get container status \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": rpc error: code = NotFound desc = could not find container \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": container with ID starting with e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0 not found: ID does not exist"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097953 4839 scope.go:117] "RemoveContainer" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"
Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 05:33:43.098980 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": container with ID starting with 08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39 not found: ID does not exist" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099058 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"} err="failed to get container status \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": rpc error: code = NotFound desc = could not find container \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": container with ID starting with 08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39 not found: ID does not exist"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099099 4839 scope.go:117] "RemoveContainer" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"
Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 05:33:43.099547 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": container with ID starting with f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4 not found: ID does not exist" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"
Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099633 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"} err="failed to get container status \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": rpc error: code = NotFound desc = could not find container \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": container with ID starting with f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4 not found: ID does not exist"
Mar 21 05:33:44 crc kubenswrapper[4839]: I0321 05:33:44.464684 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" path="/var/lib/kubelet/pods/319559c6-c34d-4b00-ba90-8fcd5b5ff425/volumes"
Mar 21 05:33:48 crc kubenswrapper[4839]: I0321 05:33:48.453626 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:33:48 crc kubenswrapper[4839]: E0321 05:33:48.454866 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.152685 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"]
Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153635 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153649 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server"
Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153664 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-utilities"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-utilities"
Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153687 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-content"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153693 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-content"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153876 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.154532 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.158699 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.158959 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.159170 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.167200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"]
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.290302 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.392795 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.413563 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.482965 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.917755 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"]
Mar 21 05:34:01 crc kubenswrapper[4839]: I0321 05:34:01.158855 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerStarted","Data":"31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628"}
Mar 21 05:34:02 crc kubenswrapper[4839]: I0321 05:34:02.468226 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.178996 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"}
Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.184373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerStarted","Data":"1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7"}
Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.236150 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567854-85fvh" podStartSLOduration=2.172201956 podStartE2EDuration="3.236131172s" podCreationTimestamp="2026-03-21 05:34:00 +0000 UTC" firstStartedPulling="2026-03-21 05:34:00.925204983 +0000 UTC m=+4245.252991659" lastFinishedPulling="2026-03-21 05:34:01.989134199 +0000 UTC m=+4246.316920875" observedRunningTime="2026-03-21 05:34:03.223458414 +0000 UTC m=+4247.551245080" watchObservedRunningTime="2026-03-21 05:34:03.236131172 +0000 UTC m=+4247.563917848"
Mar 21 05:34:04 crc kubenswrapper[4839]: I0321 05:34:04.195552 4839 generic.go:334] "Generic (PLEG): container finished" podID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerID="1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7" exitCode=0
Mar 21 05:34:04 crc kubenswrapper[4839]: I0321 05:34:04.195687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerDied","Data":"1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7"}
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.582863 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh"
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.612725 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api/0.log"
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.690311 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") "
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.696555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks" (OuterVolumeSpecName: "kube-api-access-cbrks") pod "bec1d36d-4ff4-4f29-9d04-59f088e00f09" (UID: "bec1d36d-4ff4-4f29-9d04-59f088e00f09"). InnerVolumeSpecName "kube-api-access-cbrks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.792645 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") on node \"crc\" DevicePath \"\""
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.865157 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api-log/0.log"
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.883431 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener/0.log"
Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.918295 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener-log/0.log"
Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.111796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker/0.log"
Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.119611 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker-log/0.log"
Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214189 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerDied","Data":"31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628"}
Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214527 4839 pod_container_deletor.go:80]
"Container not found in pod's containers" containerID="31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214267 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.570616 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-central-agent/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.647018 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz_a1d76458-d587-4960-9bcc-7e3d3122b44d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.665839 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.681056 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.729639 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-notification-agent/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.808703 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/proxy-httpd/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.901459 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/sg-core/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.993300 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api-log/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.046599 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.148798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/cinder-scheduler/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.342136 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/probe/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.543417 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx_a58d82e4-2de9-4680-a08c-6eeb775ed08a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.681508 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qkclf_ab9d4433-fe0e-471b-84f8-568b31920ed3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.713666 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.977095 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.130822 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt_7f875f01-020a-4cd6-950a-4dbb6ccb344e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.137275 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/dnsmasq-dns/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.219677 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-httpd/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.314326 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-log/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.462319 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-httpd/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.464463 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" path="/var/lib/kubelet/pods/c8214f95-33aa-486b-bb82-915b2c5b2cf6/volumes" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.507442 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-log/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.692865 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.919159 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7_268d87b5-57ec-49ff-be62-fe59e6b4b819/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.174090 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon-log/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.345751 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xdvx2_7538d496-3768-42b7-9f2e-70e1b44a9d6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.409668 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cb996784d-fvhvp_6a3fcdf0-3099-467b-928b-89a4876130fe/keystone-api/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.669532 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567821-rmctn_666be2f4-0416-4086-94d3-c48c82f380b2/keystone-cron/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.776283 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1626316f-b029-4424-b783-25eeb2790eb2/kube-state-metrics/0.log" Mar 21 05:34:10 crc kubenswrapper[4839]: I0321 05:34:10.361549 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-api/0.log" Mar 21 05:34:10 crc kubenswrapper[4839]: I0321 05:34:10.400529 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-w48j6_2d056acb-0183-4157-a830-fff4cd1dcacf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.059812 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-httpd/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.143843 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d_ceef8f42-5d77-44c1-ac39-edf0080f68e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.721750 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_152d0351-12d2-4cf1-ad49-fd943b223442/nova-cell0-conductor-conductor/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.763339 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-log/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.037244 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3194b187-fe06-4eed-b725-995cef2b05a0/nova-cell1-conductor-conductor/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.174386 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.275052 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-api/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.717864 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hf42f_3f8728ca-30ff-41a9-8a48-e3bb7911bcc7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.986101 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-log/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.361422 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-metadata/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.409951 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.510199 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bbecccff-0ecc-44ff-a57b-f7289b8bcf5a/nova-scheduler-scheduler/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.590468 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.660574 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/galera/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.799602 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.983999 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.996792 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/galera/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.024599 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_52b9f7e1-d86c-457e-9391-eee855a9f7a7/openstackclient/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.269840 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mx5tf_64d13111-845e-4c61-a4ce-483ddfb799b7/openstack-network-exporter/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.342135 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.459625 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.508787 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovs-vswitchd/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.649106 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.657021 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qt5s4_b31b64cb-0266-4b8a-9fcb-ae5e36c8309a/ovn-controller/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.006198 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/ovn-northd/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.019720 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.045115 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v4wqq_7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.254145 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.270031 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/ovsdbserver-nb/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.396639 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.505608 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/ovsdbserver-sb/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.587623 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-api/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.697633 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-log/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.717445 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.983361 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.014062 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/rabbitmq/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.056561 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.211163 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.227098 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/rabbitmq/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.274417 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r_66c3e343-3306-455d-89d7-db17c1bd53ed/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.470010 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pgfnn_a6dd2bff-543f-4ebb-b908-3e528f322548/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.540290 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq_acb0bb61-c53a-4171-bca5-4a3141d6904a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.096333 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-chfcw_39dbacec-c845-4f19-92a9-c0e63fba203c/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.114585 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55fzl_26adbd7b-7994-4bea-9f94-338881339833/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.320775 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-server/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.404345 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-httpd/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.445734 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkvzq_5484abbf-53f2-445a-b6fe-0996eba95345/swift-ring-rebalance/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.618038 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-auditor/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.663036 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-replicator/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.691630 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-reaper/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.803710 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-server/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.813313 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-auditor/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.883719 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-replicator/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.927131 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-server/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.052537 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-updater/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.058079 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-auditor/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.081720 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-expirer/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.234750 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-replicator/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.257686 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-updater/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.311674 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-server/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.349957 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/rsync/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.450473 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/swift-recon-cron/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.675041 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.014437 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq_4f49b501-bec5-4fe1-89d7-ff3c217ba580/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.033212 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8/test-operator-logs-container/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.186783 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h_f9d60b3b-b1b4-4d98-9da2-e152ac410c81/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:28 crc kubenswrapper[4839]: I0321 05:34:28.720082 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3c49bdbb-0c05-4dea-8de8-61ca09b7e84c/memcached/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.180556 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-2mkmz_0c51ffa0-2285-4f7e-af09-0cafba139934/manager/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.291295 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9s4vt_ee9d64a7-0d03-4cb0-a266-47b26f9957b5/manager/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.477353 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.689809 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.717359 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.771330 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.907280 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.919995 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.022823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/extract/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.241322 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-6s6q7_d3dc722f-f66c-46a0-9b1a-ae1b9c4de060/manager/0.log" Mar 21 05:34:47 crc 
kubenswrapper[4839]: I0321 05:34:47.346682 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2n27d_fd731e7e-440b-4e77-a778-08a4a62e0c9f/manager/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.578387 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dncxc_05f30a88-e899-4727-9440-981d010a1342/manager/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.971535 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-d7h7r_acb1d7ac-b3f9-4564-8346-344ffb5c3964/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.238719 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8sg4d_ccec0d11-294b-43a2-be2e-fcef8a6818c6/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.295474 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-bsdjs_ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.471796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gzh8j_6074766c-0ecd-4051-a676-dcc21b24184f/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.495812 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k4lg5_7a7bf7a3-acea-4059-8a89-db576f3588d1/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.679803 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-sp4j4_2162bafb-7e49-435c-9591-d8b725f10336/manager/0.log" Mar 21 
05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.774586 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-94vpf_70702cd5-6815-4a01-98a4-2f4dfaeef839/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.985313 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wjw9j_6914418f-3639-4ebc-a58d-d8b478cbf6b4/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.996107 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6p4mn_faac458b-73d9-4fb8-9f1c-50f7521088b0/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.186272 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-8gc22_859b11bc-e9fb-40a2-a053-66a07337965c/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.303064 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-948579bb7-j6fx6_b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59/operator/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.576905 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lj8h4_6ff65f56-ff89-43c6-b087-6d3c3b72d2ef/registry-server/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.819690 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qt58c_379b40a1-e3f5-448b-b668-0f168457e5d0/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.880733 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-x75fd_361c2d7b-9a75-41fd-953d-4b1bd64ca6df/manager/0.log" 
Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.123130 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lzbtt_c8584ecb-dc92-4cec-9178-3017f09095da/operator/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.152231 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-xt7xt_2045f5d2-c67e-47cd-b16d-3c69d449f099/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.420806 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-7f4qh_5eeb53bd-3988-458f-baa5-d265e0178aea/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.423338 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-btkvt_d3ea9c2e-11a4-492e-9e84-8294e81ce775/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.537483 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-hh27s_1d32b541-7b80-492b-adac-e51d5090b668/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.670528 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ccd4855ff-jx6pn_06f9e67e-8978-46a1-9dc8-c511197241e2/manager/0.log" Mar 21 05:34:58 crc kubenswrapper[4839]: I0321 05:34:58.361866 4839 scope.go:117] "RemoveContainer" containerID="3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886" Mar 21 05:35:10 crc kubenswrapper[4839]: I0321 05:35:10.226935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-whlp9_40014780-8cb8-47fa-8b2c-c4fb7d04a85c/control-plane-machine-set-operator/0.log" Mar 21 05:35:10 crc 
kubenswrapper[4839]: I0321 05:35:10.467373 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/kube-rbac-proxy/0.log" Mar 21 05:35:10 crc kubenswrapper[4839]: I0321 05:35:10.480272 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/machine-api-operator/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.617429 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2cpt_daed7a16-7023-463e-9d60-3f56f091f73e/cert-manager-controller/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.876898 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v297k_814a91ac-5e2f-4479-88a3-254e4216e50c/cert-manager-cainjector/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.930285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9zj6_d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f/cert-manager-webhook/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.002557 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-j5z4g_8e7a66bb-3731-4f75-9a7f-5b9d07a36b39/nmstate-console-plugin/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.199140 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k57vv_42329e42-8b9b-45ed-ab04-bf12468d8859/nmstate-handler/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.252085 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/kube-rbac-proxy/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.559760 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-vrlf4_fbd83ba5-ac43-45f6-8a15-78ba82a246f7/nmstate-operator/0.log" Mar 21 05:35:41 crc kubenswrapper[4839]: I0321 05:35:41.073026 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/nmstate-metrics/0.log" Mar 21 05:35:41 crc kubenswrapper[4839]: I0321 05:35:41.241408 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7ghd4_5a2485ca-cb21-4edf-b074-f7ac255f45f8/nmstate-webhook/0.log" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.153887 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:00 crc kubenswrapper[4839]: E0321 05:36:00.155717 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.155735 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.156003 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.156871 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159220 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159603 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.162530 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.178884 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.265787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.293881 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " 
pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.530411 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:01 crc kubenswrapper[4839]: I0321 05:36:01.003275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:01 crc kubenswrapper[4839]: I0321 05:36:01.299681 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:36:02 crc kubenswrapper[4839]: I0321 05:36:02.006552 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerStarted","Data":"ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d"} Mar 21 05:36:04 crc kubenswrapper[4839]: I0321 05:36:04.023916 4839 generic.go:334] "Generic (PLEG): container finished" podID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerID="3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83" exitCode=0 Mar 21 05:36:04 crc kubenswrapper[4839]: I0321 05:36:04.023980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerDied","Data":"3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83"} Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.371079 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.484516 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.490174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj" (OuterVolumeSpecName: "kube-api-access-f8vsj") pod "a61d1142-1394-4cf7-a8f7-6f1841a6694d" (UID: "a61d1142-1394-4cf7-a8f7-6f1841a6694d"). InnerVolumeSpecName "kube-api-access-f8vsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.587083 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047432 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerDied","Data":"ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d"} Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047479 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047501 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.438873 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.447636 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.464834 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" path="/var/lib/kubelet/pods/5ff27433-bc42-4edf-bcac-48ffe5e0680a/volumes" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.180901 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/kube-rbac-proxy/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.285718 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/controller/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.425077 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.540362 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.581462 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 
05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.590710 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.625523 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.868836 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.872045 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.878453 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.882099 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.061280 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.064492 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.087120 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.129391 4839 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/controller/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.264242 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.275462 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.345560 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy-frr/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.476763 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/reloader/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.562642 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qm7jb_06b3d06a-d515-469a-9a88-77b3f1e6c6f0/frr-k8s-webhook-server/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.735866 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8d865685-2pk4g_888cdc0b-241d-456a-9a9f-3ed253b3dbf3/manager/0.log" Mar 21 05:36:12 crc kubenswrapper[4839]: I0321 05:36:12.288289 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/kube-rbac-proxy/0.log" Mar 21 05:36:12 crc kubenswrapper[4839]: I0321 05:36:12.289988 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7df97b96d6-7wvzr_ca0627e2-8115-4514-ba93-47e00a823a31/webhook-server/0.log" Mar 21 05:36:13 crc kubenswrapper[4839]: I0321 05:36:13.052394 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr/0.log" Mar 21 05:36:13 crc kubenswrapper[4839]: I0321 05:36:13.068601 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/speaker/0.log" Mar 21 05:36:27 crc kubenswrapper[4839]: I0321 05:36:27.882798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.030850 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.069931 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.082946 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.261197 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.283208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/extract/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.293900 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.449281 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.682468 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.690381 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.695705 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.868902 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/extract/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.878142 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.904856 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.034359 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.216242 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.217650 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.222119 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.519352 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.593002 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.817224 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.015916 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.082477 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.113164 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/registry-server/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.121896 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.344959 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.375792 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.573481 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qb9bp_df9bf95b-dc8f-4104-9c6c-873159393850/marketplace-operator/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.980735 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.980815 4839 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.982372 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.192909 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.201768 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/registry-server/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.233093 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.253743 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.424300 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.438787 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: 
I0321 05:36:31.622311 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/registry-server/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.269444 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.296696 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.355226 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.443352 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.450688 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.509926 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:33 crc kubenswrapper[4839]: I0321 05:36:33.057151 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/registry-server/0.log" Mar 21 05:36:58 crc kubenswrapper[4839]: I0321 05:36:58.483892 4839 scope.go:117] "RemoveContainer" 
containerID="32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a" Mar 21 05:37:00 crc kubenswrapper[4839]: I0321 05:37:00.980288 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:37:00 crc kubenswrapper[4839]: I0321 05:37:00.980814 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.980645 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.981262 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.981312 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.982136 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.982195 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" gracePeriod=600
Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154009 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" exitCode=0
Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154055 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"}
Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154096 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"
Mar 21 05:37:32 crc kubenswrapper[4839]: I0321 05:37:32.166333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"}
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.152817 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"]
Mar 21 05:38:00 crc kubenswrapper[4839]: E0321 05:38:00.153627 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.153640 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.153916 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.154447 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"]
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.154514 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201241 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201334 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201636 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.220159 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.321597 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.340247 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.531929 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:01 crc kubenswrapper[4839]: I0321 05:38:01.007440 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"]
Mar 21 05:38:01 crc kubenswrapper[4839]: W0321 05:38:01.008982 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf62e08ee_35a1_4db2_9d9a_de4f9de7fd5f.slice/crio-3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702 WatchSource:0}: Error finding container 3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702: Status 404 returned error can't find the container with id 3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702
Mar 21 05:38:01 crc kubenswrapper[4839]: I0321 05:38:01.155353 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerStarted","Data":"3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702"}
Mar 21 05:38:03 crc kubenswrapper[4839]: I0321 05:38:03.191933 4839 generic.go:334] "Generic (PLEG): container finished" podID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerID="85a2cbb3a85126b520d32ce4bc2403f6773bb3f095dc1b0013f7736ed37e9add" exitCode=0
Mar 21 05:38:03 crc kubenswrapper[4839]: I0321 05:38:03.191974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerDied","Data":"85a2cbb3a85126b520d32ce4bc2403f6773bb3f095dc1b0013f7736ed37e9add"}
Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.508241 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.617306 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") "
Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.623986 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25" (OuterVolumeSpecName: "kube-api-access-hdr25") pod "f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" (UID: "f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f"). InnerVolumeSpecName "kube-api-access-hdr25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.720258 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217477 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl"
Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerDied","Data":"3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702"}
Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217561 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702"
Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.602202 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"]
Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.612949 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"]
Mar 21 05:38:06 crc kubenswrapper[4839]: I0321 05:38:06.474200 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" path="/var/lib/kubelet/pods/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1/volumes"
Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.834829 4839 generic.go:334] "Generic (PLEG): container finished" podID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" exitCode=0
Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.834937 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerDied","Data":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"}
Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.836139 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"
Mar 21 05:38:34 crc kubenswrapper[4839]: I0321 05:38:34.243043 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/gather/0.log"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.265332 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:42 crc kubenswrapper[4839]: E0321 05:38:42.266432 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.266450 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.268036 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.269419 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.289665 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345536 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345608 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345705 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447289 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447354 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447416 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.448006 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.448095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.477422 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.591094 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.140273 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.949655 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"}
Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.949728 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"240f05dbf34fd755fdacb0ccc96083ada20d0c3272a8aa95239fc19a9ccae79a"}
Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961188 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166" exitCode=0
Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"}
Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961394 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"}
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.147778 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"]
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.148108 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy" containerID="cri-o://048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" gracePeriod=2
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.157434 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"]
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.587708 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/copy/0.log"
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.588528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7"
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.715160 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") "
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.715279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") "
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.721921 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq" (OuterVolumeSpecName: "kube-api-access-rx5vq") pod "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" (UID: "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d"). InnerVolumeSpecName "kube-api-access-rx5vq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.817948 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.878916 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" (UID: "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.920351 4839 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.971643 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448" exitCode=0
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.971717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"}
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974435 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/copy/0.log"
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974901 4839 generic.go:334] "Generic (PLEG): container finished" podID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" exitCode=143
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974934 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7"
Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974974 4839 scope.go:117] "RemoveContainer" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.004720 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055015 4839 scope.go:117] "RemoveContainer" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"
Mar 21 05:38:46 crc kubenswrapper[4839]: E0321 05:38:46.055460 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": container with ID starting with 048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8 not found: ID does not exist" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055496 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"} err="failed to get container status \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": rpc error: code = NotFound desc = could not find container \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": container with ID starting with 048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8 not found: ID does not exist"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055516 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"
Mar 21 05:38:46 crc kubenswrapper[4839]: E0321 05:38:46.055951 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": container with ID starting with 14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e not found: ID does not exist" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055982 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"} err="failed to get container status \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": rpc error: code = NotFound desc = could not find container \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": container with ID starting with 14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e not found: ID does not exist"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.466205 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" path="/var/lib/kubelet/pods/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/volumes"
Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.984497 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"}
Mar 21 05:38:47 crc kubenswrapper[4839]: I0321 05:38:47.010656 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ll62v" podStartSLOduration=2.510352924 podStartE2EDuration="5.010636726s" podCreationTimestamp="2026-03-21 05:38:42 +0000 UTC" firstStartedPulling="2026-03-21 05:38:43.952029078 +0000 UTC m=+4528.279815764" lastFinishedPulling="2026-03-21 05:38:46.45231289 +0000 UTC m=+4530.780099566" observedRunningTime="2026-03-21 05:38:47.002768764 +0000 UTC m=+4531.330555450" watchObservedRunningTime="2026-03-21 05:38:47.010636726 +0000 UTC m=+4531.338423402"
Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.591382 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.591984 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.819072 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:53 crc kubenswrapper[4839]: I0321 05:38:53.088304 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:53 crc kubenswrapper[4839]: I0321 05:38:53.149261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.055845 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ll62v" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server" containerID="cri-o://3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" gracePeriod=2
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.508687 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.624323 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") "
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.624507 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") "
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.625017 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") "
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.626306 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities" (OuterVolumeSpecName: "utilities") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.636148 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px" (OuterVolumeSpecName: "kube-api-access-dk5px") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "kube-api-access-dk5px". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.728424 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.728837 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071391 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" exitCode=0
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"}
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071563 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"240f05dbf34fd755fdacb0ccc96083ada20d0c3272a8aa95239fc19a9ccae79a"}
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071624 4839 scope.go:117] "RemoveContainer" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.074526 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.122518 4839 scope.go:117] "RemoveContainer" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.153685 4839 scope.go:117] "RemoveContainer" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197069 4839 scope.go:117] "RemoveContainer" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"
Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.197711 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": container with ID starting with 3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1 not found: ID does not exist" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197781 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"} err="failed to get container status \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": rpc error: code = NotFound desc = could not find container \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": container with ID starting with 3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1 not found: ID does not exist"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197826 4839 scope.go:117] "RemoveContainer" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"
Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.198134 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": container with ID starting with d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448 not found: ID does not exist" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198168 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"} err="failed to get container status \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": rpc error: code = NotFound desc = could not find container \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": container with ID starting with d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448 not found: ID does not exist"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198183 4839 scope.go:117] "RemoveContainer" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"
Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.198374 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": container with ID starting with 660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166 not found: ID does not exist" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"} err="failed to get container status \"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": rpc error: code = NotFound desc = could not find container \"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": container with ID starting with 660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166 not found: ID does not exist"
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.512695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.548085 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.712919 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.726376 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"]
Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.465139 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" path="/var/lib/kubelet/pods/78b94c3e-17dd-4253-8aed-25de5cbc0215/volumes"
Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.625012 4839 scope.go:117] "RemoveContainer" containerID="e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805"
Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.653596 4839 scope.go:117] "RemoveContainer" containerID="cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba"
Mar 21 05:39:58 crc kubenswrapper[4839]: I0321 05:39:58.845476 4839 scope.go:117] "RemoveContainer" containerID="d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.148246 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"]
Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149109 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149128 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather"
Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149148 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149156 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server"
Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149188 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149198 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy"
Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149215 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-utilities"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149223 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-utilities"
Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149250 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-content"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149258 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-content"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149450 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149487 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149504 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.150321 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.152400 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.152548 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.153344 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.163272 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"]
Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.194394 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5f2k\" (UniqueName:
\"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.297203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.331052 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.500298 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.963015 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"] Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.981181 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.981288 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:40:01 crc kubenswrapper[4839]: I0321 05:40:01.514861 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerStarted","Data":"36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b"} Mar 21 05:40:03 crc kubenswrapper[4839]: I0321 05:40:03.533101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerStarted","Data":"1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08"} Mar 21 05:40:03 crc kubenswrapper[4839]: I0321 05:40:03.553109 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567860-qprns" podStartSLOduration=1.627987514 podStartE2EDuration="3.553093473s" podCreationTimestamp="2026-03-21 05:40:00 +0000 UTC" firstStartedPulling="2026-03-21 
05:40:00.965990521 +0000 UTC m=+4605.293777197" lastFinishedPulling="2026-03-21 05:40:02.89109647 +0000 UTC m=+4607.218883156" observedRunningTime="2026-03-21 05:40:03.549851152 +0000 UTC m=+4607.877637828" watchObservedRunningTime="2026-03-21 05:40:03.553093473 +0000 UTC m=+4607.880880149" Mar 21 05:40:04 crc kubenswrapper[4839]: I0321 05:40:04.544439 4839 generic.go:334] "Generic (PLEG): container finished" podID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerID="1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08" exitCode=0 Mar 21 05:40:04 crc kubenswrapper[4839]: I0321 05:40:04.544539 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerDied","Data":"1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08"} Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.896590 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.903393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.912641 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k" (OuterVolumeSpecName: "kube-api-access-p5f2k") pod "3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" (UID: "3bb631e5-c431-43b1-8e8b-ebe2a9e4842d"). InnerVolumeSpecName "kube-api-access-p5f2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.005052 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerDied","Data":"36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b"} Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564455 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564538 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.623517 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.634793 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:40:08 crc kubenswrapper[4839]: I0321 05:40:08.474814 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" path="/var/lib/kubelet/pods/bec1d36d-4ff4-4f29-9d04-59f088e00f09/volumes" Mar 21 05:40:30 crc kubenswrapper[4839]: I0321 05:40:30.980012 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:40:30 crc kubenswrapper[4839]: I0321 05:40:30.981696 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:40:58 crc kubenswrapper[4839]: I0321 05:40:58.899282 4839 scope.go:117] "RemoveContainer" containerID="1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.980613 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981001 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981060 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981719 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981803 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" gracePeriod=600 Mar 21 05:41:01 crc kubenswrapper[4839]: E0321 05:41:01.175161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334393 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" exitCode=0 Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"} Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334534 4839 scope.go:117] "RemoveContainer" containerID="c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.335530 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:01 crc kubenswrapper[4839]: E0321 05:41:01.336046 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:16 crc kubenswrapper[4839]: I0321 05:41:16.461474 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:16 crc kubenswrapper[4839]: E0321 05:41:16.463243 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:31 crc kubenswrapper[4839]: I0321 05:41:31.453944 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:31 crc kubenswrapper[4839]: E0321 05:41:31.454736 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:42 crc kubenswrapper[4839]: I0321 05:41:42.453115 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:42 crc kubenswrapper[4839]: E0321 05:41:42.453840 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:55 crc kubenswrapper[4839]: I0321 05:41:55.453370 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:55 crc kubenswrapper[4839]: E0321 05:41:55.455162 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.165618 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:00 crc kubenswrapper[4839]: E0321 05:42:00.166651 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.166663 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.166834 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.167597 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.169692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.170665 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.170829 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.185667 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.328378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.430240 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.448783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " 
pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.488931 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:01 crc kubenswrapper[4839]: I0321 05:42:01.295078 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:01 crc kubenswrapper[4839]: I0321 05:42:01.301773 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:42:02 crc kubenswrapper[4839]: I0321 05:42:02.084935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerStarted","Data":"2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7"} Mar 21 05:42:04 crc kubenswrapper[4839]: I0321 05:42:04.115254 4839 generic.go:334] "Generic (PLEG): container finished" podID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerID="93da64406c417a6e2ac4bf006b6bc4a4396a6d01c348ab94c9c8fddd70192ce0" exitCode=0 Mar 21 05:42:04 crc kubenswrapper[4839]: I0321 05:42:04.115323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerDied","Data":"93da64406c417a6e2ac4bf006b6bc4a4396a6d01c348ab94c9c8fddd70192ce0"} Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.525521 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.585521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"310ca8e3-f2ad-491a-9453-3fc357628cd3\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.593157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675" (OuterVolumeSpecName: "kube-api-access-8c675") pod "310ca8e3-f2ad-491a-9453-3fc357628cd3" (UID: "310ca8e3-f2ad-491a-9453-3fc357628cd3"). InnerVolumeSpecName "kube-api-access-8c675". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.687246 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") on node \"crc\" DevicePath \"\"" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134284 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerDied","Data":"2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7"} Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134319 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134676 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.604436 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.613766 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:42:08 crc kubenswrapper[4839]: I0321 05:42:08.453500 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:42:08 crc kubenswrapper[4839]: E0321 05:42:08.459526 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:08 crc kubenswrapper[4839]: I0321 05:42:08.476242 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" path="/var/lib/kubelet/pods/a61d1142-1394-4cf7-a8f7-6f1841a6694d/volumes" Mar 21 05:42:19 crc kubenswrapper[4839]: I0321 05:42:19.453205 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:42:19 crc kubenswrapper[4839]: E0321 05:42:19.453861 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" 
podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.753720 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqgqm"] Mar 21 05:42:23 crc kubenswrapper[4839]: E0321 05:42:23.754777 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.754791 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.754968 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.756322 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.766643 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"] Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.837922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.837986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm" Mar 21 
05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.838016 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939255 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939863 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.960121 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:24 crc kubenswrapper[4839]: I0321 05:42:24.112943 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:24 crc kubenswrapper[4839]: I0321 05:42:24.651745 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605526 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af" exitCode=0
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605607 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"}
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerStarted","Data":"69a350c23f645fd571cfc04982724c5750af661fc4c28d11db24e0b6e55d97e9"}
Mar 21 05:42:27 crc kubenswrapper[4839]: I0321 05:42:27.626184 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00" exitCode=0
Mar 21 05:42:27 crc kubenswrapper[4839]: I0321 05:42:27.626256 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"}
Mar 21 05:42:28 crc kubenswrapper[4839]: I0321 05:42:28.648790 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerStarted","Data":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"}
Mar 21 05:42:28 crc kubenswrapper[4839]: I0321 05:42:28.678147 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqgqm" podStartSLOduration=3.240137242 podStartE2EDuration="5.678125436s" podCreationTimestamp="2026-03-21 05:42:23 +0000 UTC" firstStartedPulling="2026-03-21 05:42:25.607720065 +0000 UTC m=+4749.935506741" lastFinishedPulling="2026-03-21 05:42:28.045708259 +0000 UTC m=+4752.373494935" observedRunningTime="2026-03-21 05:42:28.67545032 +0000 UTC m=+4753.003237026" watchObservedRunningTime="2026-03-21 05:42:28.678125436 +0000 UTC m=+4753.005912112"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.114042 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.114323 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.162668 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.454061 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:42:34 crc kubenswrapper[4839]: E0321 05:42:34.454467 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.760885 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.821956 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:36 crc kubenswrapper[4839]: I0321 05:42:36.731005 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqgqm" podUID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerName="registry-server" containerID="cri-o://8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571" gracePeriod=2
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.224342 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328456 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328555 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328747 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.330046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities" (OuterVolumeSpecName: "utilities") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.338409 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz" (OuterVolumeSpecName: "kube-api-access-hp6sz") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "kube-api-access-hp6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.388282 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.430884 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.431150 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.431212 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742037 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571" exitCode=0
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742093 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"}
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742172 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"69a350c23f645fd571cfc04982724c5750af661fc4c28d11db24e0b6e55d97e9"}
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742199 4839 scope.go:117] "RemoveContainer" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.744953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.763008 4839 scope.go:117] "RemoveContainer" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.788719 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.796340 4839 scope.go:117] "RemoveContainer" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.802556 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.844905 4839 scope.go:117] "RemoveContainer" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.845315 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": container with ID starting with 8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571 not found: ID does not exist" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"} err="failed to get container status \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": rpc error: code = NotFound desc = could not find container \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": container with ID starting with 8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571 not found: ID does not exist"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845385 4839 scope.go:117] "RemoveContainer" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.845841 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": container with ID starting with b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00 not found: ID does not exist" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845913 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"} err="failed to get container status \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": rpc error: code = NotFound desc = could not find container \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": container with ID starting with b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00 not found: ID does not exist"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845947 4839 scope.go:117] "RemoveContainer" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.846335 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": container with ID starting with 2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af not found: ID does not exist" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.846358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"} err="failed to get container status \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": rpc error: code = NotFound desc = could not find container \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": container with ID starting with 2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af not found: ID does not exist"
Mar 21 05:42:38 crc kubenswrapper[4839]: I0321 05:42:38.462395 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" path="/var/lib/kubelet/pods/31974c6b-82d8-4d18-9dc2-9a7f29374d2f/volumes"
Mar 21 05:42:48 crc kubenswrapper[4839]: I0321 05:42:48.453029 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:42:48 crc kubenswrapper[4839]: E0321 05:42:48.453811 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:42:58 crc kubenswrapper[4839]: I0321 05:42:58.984224 4839 scope.go:117] "RemoveContainer" containerID="3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83"
Mar 21 05:43:00 crc kubenswrapper[4839]: I0321 05:43:00.453313 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:43:00 crc kubenswrapper[4839]: E0321 05:43:00.453877 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"